Nov 28 13:19:06 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 28 13:19:06 crc restorecon[4682]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 28 13:19:06 crc restorecon[4682]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 
13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:06
crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 
13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 28 13:19:06 crc restorecon[4682]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 
crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc 
restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc 
restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:07 crc restorecon[4682]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 28 13:19:07 crc kubenswrapper[4747]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 28 13:19:07 crc kubenswrapper[4747]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 28 13:19:07 crc kubenswrapper[4747]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 28 13:19:07 crc kubenswrapper[4747]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 28 13:19:07 crc kubenswrapper[4747]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 28 13:19:07 crc kubenswrapper[4747]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.445243 4747 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448804 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448825 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448829 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448834 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448838 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448842 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448847 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448852 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448857 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448861 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448866 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448870 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448874 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448879 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448885 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448890 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448895 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448901 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448906 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448911 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448916 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation 
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448921 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448926 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448931 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448936 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448941 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448947 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448952 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448956 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448960 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448963 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448967 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448971 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448974 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448978 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448984 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448988 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448991 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448995 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.448998 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449002 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449005 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449010 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449015 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449018 4747 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449023 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449027 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449032 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449037 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449041 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449046 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449051 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449057 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449062 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449066 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449070 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449075 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449079 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449083 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449088 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449091 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449094 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449098 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449102 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449105 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449109 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449113 4747 feature_gate.go:330] unrecognized feature gate: Example
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449118 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449122 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449127 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.449133 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449409 4747 flags.go:64] FLAG: --address="0.0.0.0"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449422 4747 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449429 4747 flags.go:64] FLAG: --anonymous-auth="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449435 4747 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449441 4747 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449446 4747 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449452 4747 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449458 4747 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449463 4747 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449468 4747 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449473 4747 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449478 4747 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449482 4747 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449487 4747 flags.go:64] FLAG: --cgroup-root=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449491 4747 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449495 4747 flags.go:64] FLAG: --client-ca-file=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449499 4747 flags.go:64] FLAG: --cloud-config=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449503 4747 flags.go:64] FLAG: --cloud-provider=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449507 4747 flags.go:64] FLAG: --cluster-dns="[]"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449512 4747 flags.go:64] FLAG: --cluster-domain=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449516 4747 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449520 4747 flags.go:64] FLAG: --config-dir=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449524 4747 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449530 4747 flags.go:64] FLAG: --container-log-max-files="5"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449537 4747 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449542 4747 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449547 4747 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449552 4747 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449557 4747 flags.go:64] FLAG: --contention-profiling="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449561 4747 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449564 4747 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449569 4747 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449573 4747 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449578 4747 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449582 4747 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449586 4747 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449591 4747 flags.go:64] FLAG: --enable-load-reader="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449595 4747 flags.go:64] FLAG: --enable-server="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449599 4747 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449604 4747 flags.go:64] FLAG: --event-burst="100"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449608 4747 flags.go:64] FLAG: --event-qps="50"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449612 4747 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449617 4747 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449621 4747 flags.go:64] FLAG: --eviction-hard=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449627 4747 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449633 4747 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449637 4747 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449641 4747 flags.go:64] FLAG: --eviction-soft=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449646 4747 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449650 4747 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449654 4747 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449658 4747 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449662 4747 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449666 4747 flags.go:64] FLAG: --fail-swap-on="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449671 4747 flags.go:64] FLAG: --feature-gates=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449677 4747 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449682 4747 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449686 4747 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449690 4747 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449695 4747 flags.go:64] FLAG: --healthz-port="10248"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449699 4747 flags.go:64] FLAG: --help="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449703 4747 flags.go:64] FLAG: --hostname-override=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449708 4747 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449712 4747 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449719 4747 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449723 4747 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449727 4747 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449731 4747 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449735 4747 flags.go:64] FLAG: --image-service-endpoint=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449740 4747 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449743 4747 flags.go:64] FLAG: --kube-api-burst="100"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449747 4747 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449752 4747 flags.go:64] FLAG: --kube-api-qps="50"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449756 4747 flags.go:64] FLAG: --kube-reserved=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449761 4747 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449765 4747 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449769 4747 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449774 4747 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449777 4747 flags.go:64] FLAG: --lock-file=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449781 4747 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449786 4747 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449790 4747 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449796 4747 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449800 4747 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449804 4747 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449808 4747 flags.go:64] FLAG: --logging-format="text"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449812 4747 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449816 4747 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449820 4747 flags.go:64] FLAG: --manifest-url=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449824 4747 flags.go:64] FLAG: --manifest-url-header=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449830 4747 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449834 4747 flags.go:64] FLAG: --max-open-files="1000000"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449839 4747 flags.go:64] FLAG: --max-pods="110"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449843 4747 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449847 4747 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449851 4747 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449856 4747 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449860 4747 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449864 4747 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449868 4747 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449879 4747 flags.go:64] FLAG: --node-status-max-images="50"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449884 4747 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449888 4747 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449892 4747 flags.go:64] FLAG: --pod-cidr=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449896 4747 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449902 4747 flags.go:64] FLAG: --pod-manifest-path=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449906 4747 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449910 4747 flags.go:64] FLAG: --pods-per-core="0"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449915 4747 flags.go:64] FLAG: --port="10250"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449921 4747 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449926 4747 flags.go:64] FLAG: --provider-id=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449930 4747 flags.go:64] FLAG: --qos-reserved=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449934 4747 flags.go:64] FLAG: --read-only-port="10255"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449938 4747 flags.go:64] FLAG: --register-node="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449942 4747 flags.go:64] FLAG: --register-schedulable="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449946 4747 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449953 4747 flags.go:64] FLAG: --registry-burst="10"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449957 4747 flags.go:64] FLAG: --registry-qps="5"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449961 4747 flags.go:64] FLAG: --reserved-cpus=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449965 4747 flags.go:64] FLAG: --reserved-memory=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449971 4747 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449975 4747 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449980 4747 flags.go:64] FLAG: --rotate-certificates="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449985 4747 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449990 4747 flags.go:64] FLAG: --runonce="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449994 4747 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.449998 4747 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450002 4747 flags.go:64] FLAG: --seccomp-default="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450006 4747 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450010 4747 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450014 4747 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450018 4747 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450023 4747 flags.go:64] FLAG: --storage-driver-password="root"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450027 4747 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450031 4747 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450034 4747 flags.go:64] FLAG: --storage-driver-user="root"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450038 4747 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450042 4747 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450047 4747 flags.go:64] FLAG: --system-cgroups=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450050 4747 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450058 4747 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450062 4747 flags.go:64] FLAG: --tls-cert-file=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450067 4747 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450073 4747 flags.go:64] FLAG: --tls-min-version=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450078 4747 flags.go:64] FLAG: --tls-private-key-file=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450082 4747 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450086 4747 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450094 4747 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450098 4747 flags.go:64] FLAG: --v="2"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450104 4747 flags.go:64] FLAG: --version="false"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450109 4747 flags.go:64] FLAG: --vmodule=""
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450114 4747 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450119 4747 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450263 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450270 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450274 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450278 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450282 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450285 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450290 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450295 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450299 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450303 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450307 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450310 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450314 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450318 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450321 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450325 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450330 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450334 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450338 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450342 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450347 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450353 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450359 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450364 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450368 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450372 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450380 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450385 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450388 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450392 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450396 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450400 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450403 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450406 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450410 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450413 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450417 4747 feature_gate.go:330] unrecognized feature gate: Example
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450420 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450424 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450427 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450430 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450434 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450437 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450441 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450445 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450449 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450453 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450456 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450461 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450465 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450469 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450473 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450477 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450481 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450486 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450490 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450494 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450499 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450505 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450510 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450514 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450518 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450522 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450527 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450532 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450536 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450541 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450545 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450549 4747 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450553 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.450556 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.450563 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.466817 4747 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.466871 4747 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467068 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467085 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467096 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467104 4747 feature_gate.go:330] unrecognized feature gate: Example
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467115 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467128 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467140 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467153 4747 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467163 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467174 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467184 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467194 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467233 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467243 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467250 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467259 4747 feature_gate.go:330] unrecognized
feature gate: GatewayAPI Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467266 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467275 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467282 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467293 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467306 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467318 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467328 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467338 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467347 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467359 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467367 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467377 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467387 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 
13:19:07.467395 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467404 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467412 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467463 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467476 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467485 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467495 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467505 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467513 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467522 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467531 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467539 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467547 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467554 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467562 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467570 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467578 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467586 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467593 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467601 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467609 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467616 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467624 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467632 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467640 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467648 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467656 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467663 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467672 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467680 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467689 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467698 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467706 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467715 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467725 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467734 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467742 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467750 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467759 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467766 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467774 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.467782 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.467796 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468067 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468081 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468091 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468100 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468110 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468123 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468132 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468143 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468153 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468163 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468173 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468183 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468191 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468200 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468233 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468241 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468249 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468257 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468266 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468274 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468285 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468294 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468302 4747 feature_gate.go:330] unrecognized feature gate: Example
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468311 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468319 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468327 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468334 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468342 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468350 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468359 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468366 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468374 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468382 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468390 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468398 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468406 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468413 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468438 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468446 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468454 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468462 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468470 4747 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468477 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468485 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468493 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468501 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468508 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468516 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468523 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468531 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468539 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468546 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468555 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468563 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468575 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468588 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468598 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468608 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468616 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468627 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468637 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468648 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468657 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468666 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468674 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468683 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468691 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468702 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468712 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468722 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.468731 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.468745 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.469386 4747 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.474502 4747 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.474668 4747 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.475544 4747 server.go:997] "Starting client certificate rotation"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.475583 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.476122 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-28 09:37:09.82474843 +0000 UTC
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.476352 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.483046 4747 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.485447 4747 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.486714 4747 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.496914 4747 log.go:25] "Validated CRI v1 runtime API"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.519665 4747 log.go:25] "Validated CRI v1 image API"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.521844 4747 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.526487 4747 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-28-13-15-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.526578 4747 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.557595 4747 manager.go:217] Machine: {Timestamp:2025-11-28 13:19:07.555148917 +0000 UTC m=+0.217630717 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4871ee14-35cb-4f3f-af5a-3f1522596ec5 BootID:b96c0c46-5b5f-49cf-b534-641e4124214f Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:db:fe:e1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:db:fe:e1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:65:34:15 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:98:a0:22 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:20:b2:99 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f1:59:1c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:02:b9:e5:a7:14:e5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:17:8d:bd:b0:40 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.558104 4747 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.558514 4747 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.560036 4747 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.560346 4747 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.560394 4747 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.560714 4747 topology_manager.go:138] "Creating topology manager with none policy"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.560730 4747 container_manager_linux.go:303] "Creating device plugin manager"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.561007 4747 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.561067 4747 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.561378 4747 state_mem.go:36] "Initialized new in-memory state store"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.561615 4747 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.562555 4747 kubelet.go:418] "Attempting to sync node with API server"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.562587 4747 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.562623 4747 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.562643 4747 kubelet.go:324] "Adding apiserver pod source"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.562658 4747 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.565468 4747 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.565469 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.565620 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.565590 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.565710 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.565875 4747 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.566625 4747 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567165 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567224 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567233 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567240 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567253 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567261 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567269 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567284 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567296 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567306 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567319 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567329 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.567762 4747 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.568341 4747 server.go:1280] "Started kubelet" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.568967 4747 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.568986 4747 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.570045 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:07 crc systemd[1]: Started Kubernetes Kubelet. Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.571040 4747 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.573505 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.573563 4747 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.573931 4747 server.go:460] "Adding debug handlers to kubelet server" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.574246 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:18:31.394929949 +0000 UTC Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.574329 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 526h59m23.820606043s for next certificate rotation Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.575929 4747 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 28 13:19:07 crc 
kubenswrapper[4747]: I1128 13:19:07.576008 4747 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.576275 4747 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.575667 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187c2e324b42cbf4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-28 13:19:07.568303092 +0000 UTC m=+0.230784832,LastTimestamp:2025-11-28 13:19:07.568303092 +0000 UTC m=+0.230784832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.577810 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.577827 4747 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.584694 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 
13:19:07.584839 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.585645 4747 factory.go:55] Registering systemd factory Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.585859 4747 factory.go:221] Registration of the systemd container factory successfully Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.586839 4747 factory.go:153] Registering CRI-O factory Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.586868 4747 factory.go:221] Registration of the crio container factory successfully Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.586960 4747 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.586997 4747 factory.go:103] Registering Raw factory Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.587250 4747 manager.go:1196] Started watching for new ooms in manager Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.589590 4747 manager.go:319] Starting recovery of all containers Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595715 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595814 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595832 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595851 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595867 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595879 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595891 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595919 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595935 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595949 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595963 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.595975 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596033 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596052 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" 
seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596069 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596083 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596097 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596112 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596124 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596140 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 28 13:19:07 crc 
kubenswrapper[4747]: I1128 13:19:07.596158 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596199 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596231 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596252 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596268 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596285 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596351 4747 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596372 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596393 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596409 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596424 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596441 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596458 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596474 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596488 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596504 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596544 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596560 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596577 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596597 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596613 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596628 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596644 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596662 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596676 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596694 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596711 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596725 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596739 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596785 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596799 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" 
Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596814 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596838 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596854 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596872 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596887 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.596905 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 
13:19:07.597570 4747 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597599 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597618 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597633 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597645 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597664 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" 
seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597678 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597693 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597710 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597724 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597740 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597756 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 28 13:19:07 crc 
kubenswrapper[4747]: I1128 13:19:07.597772 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597786 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597804 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597817 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597829 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597844 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597858 4747 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597873 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597886 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597900 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597915 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597930 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597942 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597957 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597970 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597983 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.597998 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598011 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598029 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598044 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598058 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598078 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598092 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598110 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598163 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598179 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598193 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598225 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598244 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598259 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598273 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598288 4747 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598302 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598316 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598332 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598345 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598372 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598388 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598408 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598422 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598437 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598451 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598464 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598480 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598496 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598510 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598548 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598564 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598581 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598639 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598654 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598668 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598683 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598697 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598710 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598723 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598739 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598752 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598767 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598783 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598797 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598809 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598825 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598841 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598854 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598867 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598884 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598898 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598920 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598938 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598951 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598964 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598979 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.598993 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599006 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599025 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599041 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599057 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599070 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599089 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" 
seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599102 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599148 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599165 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599179 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599195 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599235 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: 
I1128 13:19:07.599251 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599264 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599278 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599293 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599307 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599332 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599346 4747 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599363 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599379 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599394 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599408 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599424 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599481 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599499 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599514 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599527 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599541 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599616 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599631 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599647 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599663 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599680 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599694 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599707 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599723 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599738 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599751 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599769 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599785 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599801 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599815 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599829 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599843 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599858 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599872 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599888 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599902 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" 
seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599917 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599931 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599946 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599966 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.599984 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600002 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:07 
crc kubenswrapper[4747]: I1128 13:19:07.600018 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600041 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600054 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600069 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600084 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600100 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600113 4747 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600129 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600144 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600161 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600175 4747 reconstruct.go:97] "Volume reconstruction finished" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.600187 4747 reconciler.go:26] "Reconciler: start to sync state" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.615873 4747 manager.go:324] Recovery completed Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.630438 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.632200 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.632275 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 
13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.632289 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.634091 4747 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.634115 4747 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.634138 4747 state_mem.go:36] "Initialized new in-memory state store" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.638579 4747 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.640024 4747 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.640076 4747 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.640111 4747 kubelet.go:2335] "Starting kubelet main sync loop" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.640200 4747 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 28 13:19:07 crc kubenswrapper[4747]: W1128 13:19:07.642036 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.642124 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: 
connection refused" logger="UnhandledError" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.644511 4747 policy_none.go:49] "None policy: Start" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.646860 4747 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.646919 4747 state_mem.go:35] "Initializing new in-memory state store" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.678092 4747 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.699087 4747 manager.go:334] "Starting Device Plugin manager" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.699179 4747 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.699196 4747 server.go:79] "Starting device plugin registration server" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.700804 4747 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.700864 4747 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.702068 4747 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.702228 4747 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.702240 4747 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.709153 4747 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 
13:19:07.741269 4747 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.741379 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.742599 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.742658 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.742672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.742935 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.743048 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.743082 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744243 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744632 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744826 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.744907 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746066 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746273 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746484 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746524 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.746985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.747013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.747024 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.747228 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.747323 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.747362 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.747963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748781 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748793 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748874 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.748901 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.749588 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.749617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.749628 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.778456 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.801689 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.802821 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.802863 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.802922 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.802943 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.802959 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803015 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803036 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803056 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803075 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803163 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803170 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803225 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803353 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803365 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803397 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803495 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.803593 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: E1128 13:19:07.804125 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Nov 28 13:19:07 crc 
kubenswrapper[4747]: I1128 13:19:07.905251 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905331 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905378 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905412 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905482 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905516 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905583 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905604 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905619 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905709 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905746 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905775 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905597 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905690 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905887 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905753 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905843 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905597 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905958 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.905996 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.906031 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.906064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.906084 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.906118 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.906098 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 
13:19:07.906093 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.906145 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:07 crc kubenswrapper[4747]: I1128 13:19:07.906241 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.004988 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.006843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.006897 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.006916 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.006952 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:08 crc kubenswrapper[4747]: E1128 13:19:08.007610 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.074284 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.081114 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.096472 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.110036 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ec9d596104ba7588240b4c5c92db8b50ea82ad0ae211676a8a7c70bb631fdf96 WatchSource:0}: Error finding container ec9d596104ba7588240b4c5c92db8b50ea82ad0ae211676a8a7c70bb631fdf96: Status 404 returned error can't find the container with id ec9d596104ba7588240b4c5c92db8b50ea82ad0ae211676a8a7c70bb631fdf96 Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.113310 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0b17a361882a47b39b2ebe03c7eadf3f72e6e9b429ceb60c241ee90e6b8ff2a4 WatchSource:0}: Error finding container 0b17a361882a47b39b2ebe03c7eadf3f72e6e9b429ceb60c241ee90e6b8ff2a4: Status 404 returned error can't find the container with id 0b17a361882a47b39b2ebe03c7eadf3f72e6e9b429ceb60c241ee90e6b8ff2a4 Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.117501 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.118751 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a3e8cdfb09513ecf92a9701656e5cdfbc62318658959ec85f3b88efc2ed18a06 WatchSource:0}: Error finding container a3e8cdfb09513ecf92a9701656e5cdfbc62318658959ec85f3b88efc2ed18a06: Status 404 returned error can't find the container with id a3e8cdfb09513ecf92a9701656e5cdfbc62318658959ec85f3b88efc2ed18a06 Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.122397 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.140366 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a3435787a10956446430a7925a742bccd8a99a9e99b6a9a2b0dd57f3993072d3 WatchSource:0}: Error finding container a3435787a10956446430a7925a742bccd8a99a9e99b6a9a2b0dd57f3993072d3: Status 404 returned error can't find the container with id a3435787a10956446430a7925a742bccd8a99a9e99b6a9a2b0dd57f3993072d3 Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.144650 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-8ba0caff35340ae22ecdb2f01f324acd9d553851b3b3f33ceb08a0d5c9fc6d7a WatchSource:0}: Error finding container 8ba0caff35340ae22ecdb2f01f324acd9d553851b3b3f33ceb08a0d5c9fc6d7a: Status 404 returned error can't find the container with id 8ba0caff35340ae22ecdb2f01f324acd9d553851b3b3f33ceb08a0d5c9fc6d7a Nov 28 13:19:08 crc kubenswrapper[4747]: E1128 13:19:08.180638 4747 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.408588 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.410328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.410366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.410376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.410399 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:08 crc kubenswrapper[4747]: E1128 13:19:08.410983 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.429906 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:08 crc kubenswrapper[4747]: E1128 13:19:08.429998 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: 
connect: connection refused" logger="UnhandledError" Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.561816 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:08 crc kubenswrapper[4747]: E1128 13:19:08.562235 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.571688 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.646409 4747 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8" exitCode=0 Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.646526 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.646611 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8ba0caff35340ae22ecdb2f01f324acd9d553851b3b3f33ceb08a0d5c9fc6d7a"} Nov 28 13:19:08 crc 
kubenswrapper[4747]: I1128 13:19:08.646707 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.647686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.647731 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3435787a10956446430a7925a742bccd8a99a9e99b6a9a2b0dd57f3993072d3"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.647841 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.647878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.647888 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.648885 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6" exitCode=0 Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.648956 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.648979 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a3e8cdfb09513ecf92a9701656e5cdfbc62318658959ec85f3b88efc2ed18a06"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.649086 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.649945 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.649978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.649991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.652570 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833" exitCode=0 Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.652637 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.652706 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0b17a361882a47b39b2ebe03c7eadf3f72e6e9b429ceb60c241ee90e6b8ff2a4"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.652862 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.653876 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.653900 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.653913 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.654762 4747 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="97145d734885e2b42870b715ad114b35a3da0d6b5a415acf61b60be71538a532" exitCode=0 Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.654796 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"97145d734885e2b42870b715ad114b35a3da0d6b5a415acf61b60be71538a532"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.654833 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ec9d596104ba7588240b4c5c92db8b50ea82ad0ae211676a8a7c70bb631fdf96"} Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.654928 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.655033 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.659997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.660040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.660051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.660306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.660371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:08 crc kubenswrapper[4747]: I1128 13:19:08.660388 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:08 crc kubenswrapper[4747]: W1128 13:19:08.970301 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:08 crc kubenswrapper[4747]: E1128 13:19:08.970469 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:08 crc kubenswrapper[4747]: E1128 13:19:08.982262 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Nov 28 13:19:09 crc kubenswrapper[4747]: W1128 13:19:09.129520 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Nov 28 13:19:09 crc kubenswrapper[4747]: E1128 13:19:09.129620 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.211115 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.212865 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.212910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.212923 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.212955 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.663170 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.663253 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.663266 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.663396 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.664419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.664452 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.664462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.665689 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.665721 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.665734 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.665804 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.666708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.666748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.666761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.667539 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47" exitCode=0 Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.667597 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.667691 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.668656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.668696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 
13:19:09.668724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.670372 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.671783 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.671822 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.671838 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.671850 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.675403 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c3e6ddaf27524483928366dd33a7b1b102a1bd247536134bbf1c7e7ad805913a"} Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.675487 4747 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.676262 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.676294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:09 crc kubenswrapper[4747]: I1128 13:19:09.676309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.682056 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c" exitCode=0 Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.682159 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c"} Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.683153 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.684383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.684477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.684544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.687833 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:10 crc 
kubenswrapper[4747]: I1128 13:19:10.687850 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6"} Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.688009 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.689077 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.689137 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.689159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.689706 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.689755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:10 crc kubenswrapper[4747]: I1128 13:19:10.689768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.695118 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416"} Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.695242 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:11 crc kubenswrapper[4747]: 
I1128 13:19:11.695253 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee"} Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.695378 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c"} Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.695395 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782"} Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.695422 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.696310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.696354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.696370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:11 crc kubenswrapper[4747]: I1128 13:19:11.845145 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.248726 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 
13:19:12.248960 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.250980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.251028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.251039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.703733 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.704246 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.704371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3"} Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.704920 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.704962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.704970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.705000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 
13:19:12.704987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:12 crc kubenswrapper[4747]: I1128 13:19:12.705098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.659752 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.708846 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.708846 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.710592 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.710649 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.710655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.710708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.710734 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:13 crc kubenswrapper[4747]: I1128 13:19:13.710671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:14 crc kubenswrapper[4747]: I1128 13:19:14.711162 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 28 13:19:14 crc kubenswrapper[4747]: I1128 13:19:14.712223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:14 crc kubenswrapper[4747]: I1128 13:19:14.712257 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:14 crc kubenswrapper[4747]: I1128 13:19:14.712273 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.203582 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.203781 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.207622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.207672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.207684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.248838 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.248929 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.354912 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.713417 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.714929 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.715005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:15 crc kubenswrapper[4747]: I1128 13:19:15.715026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.117531 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.117887 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.119819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.119883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.119893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.124936 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.403444 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.705328 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:17 crc kubenswrapper[4747]: E1128 13:19:17.709286 4747 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.718121 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.719557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.719621 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:17 crc kubenswrapper[4747]: I1128 13:19:17.719644 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.026264 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.026570 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.028542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.028597 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.028615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.721260 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.724197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.724317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.724344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:18 crc kubenswrapper[4747]: I1128 13:19:18.728465 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:19 crc kubenswrapper[4747]: E1128 13:19:19.214405 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Nov 28 13:19:19 crc kubenswrapper[4747]: I1128 13:19:19.365541 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 28 13:19:19 crc kubenswrapper[4747]: I1128 13:19:19.365623 4747 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 28 13:19:19 crc kubenswrapper[4747]: I1128 13:19:19.571657 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 28 13:19:19 crc kubenswrapper[4747]: E1128 13:19:19.648659 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187c2e324b42cbf4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-28 13:19:07.568303092 +0000 UTC m=+0.230784832,LastTimestamp:2025-11-28 13:19:07.568303092 +0000 UTC m=+0.230784832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 28 13:19:19 crc kubenswrapper[4747]: E1128 13:19:19.671998 4747 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 28 13:19:19 crc kubenswrapper[4747]: I1128 13:19:19.724269 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 
13:19:19 crc kubenswrapper[4747]: I1128 13:19:19.725318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:19 crc kubenswrapper[4747]: I1128 13:19:19.725380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:19 crc kubenswrapper[4747]: I1128 13:19:19.725392 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.191173 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.191242 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.195877 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.195932 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.815187 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.816999 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.817074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.817097 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:20 crc kubenswrapper[4747]: I1128 13:19:20.817142 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.854768 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.854961 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.855295 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 28 13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.855334 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 28 
13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.855955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.855991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.856004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:21 crc kubenswrapper[4747]: I1128 13:19:21.859382 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:22 crc kubenswrapper[4747]: I1128 13:19:22.738382 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:22 crc kubenswrapper[4747]: I1128 13:19:22.739011 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 28 13:19:22 crc kubenswrapper[4747]: I1128 13:19:22.739133 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 28 13:19:22 crc kubenswrapper[4747]: I1128 13:19:22.739889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:22 crc kubenswrapper[4747]: I1128 13:19:22.740302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:22 crc 
kubenswrapper[4747]: I1128 13:19:22.740348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:22 crc kubenswrapper[4747]: I1128 13:19:22.822360 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 28 13:19:22 crc kubenswrapper[4747]: I1128 13:19:22.822456 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 28 13:19:23 crc kubenswrapper[4747]: I1128 13:19:23.953882 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 28 13:19:23 crc kubenswrapper[4747]: I1128 13:19:23.974889 4747 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Nov 28 13:19:24 crc kubenswrapper[4747]: I1128 13:19:24.308726 4747 csr.go:261] certificate signing request csr-jqh7p is approved, waiting to be issued Nov 28 13:19:24 crc kubenswrapper[4747]: I1128 13:19:24.319299 4747 csr.go:257] certificate signing request csr-jqh7p is issued Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.187550 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.189041 4747 trace.go:236] Trace[1921632071]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:10.968) (total time: 14220ms): Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[1921632071]: ---"Objects listed" error: 14220ms (13:19:25.188) Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[1921632071]: [14.220923075s] [14.220923075s] END Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.189083 4747 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.190501 4747 trace.go:236] Trace[687408458]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:10.469) (total time: 14720ms): Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[687408458]: ---"Objects listed" error: 14720ms (13:19:25.190) Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[687408458]: [14.720599207s] [14.720599207s] END Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.190543 4747 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.191260 4747 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.191279 4747 trace.go:236] Trace[87992039]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:11.381) (total time: 13809ms): Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[87992039]: ---"Objects listed" error: 13809ms (13:19:25.191) Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[87992039]: [13.809851313s] [13.809851313s] END Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.191336 4747 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.198002 4747 trace.go:236] Trace[1739375473]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:11.498) (total 
time: 13699ms): Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[1739375473]: ---"Objects listed" error: 13699ms (13:19:25.197) Nov 28 13:19:25 crc kubenswrapper[4747]: Trace[1739375473]: [13.69947252s] [13.69947252s] END Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.198036 4747 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.230269 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.246449 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.249300 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.249370 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.321019 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-11-28 13:14:24 +0000 UTC, rotation deadline is 2026-10-16 11:19:36.539468297 +0000 UTC Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.321089 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 
7726h0m11.218383662s for next certificate rotation Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.574059 4747 apiserver.go:52] "Watching apiserver" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.579836 4747 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.580379 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.580868 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.581055 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.581147 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.581189 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.581255 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.581338 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.581369 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.581291 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.581501 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.584053 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.600032 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.600077 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.600118 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.600503 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.600577 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.600811 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.601226 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.609708 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.629469 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.640654 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.662148 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.674996 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.677984 4747 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.691474 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694492 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694552 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694574 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694589 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694626 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694642 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694676 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694706 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694725 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694741 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694771 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 28 
13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694792 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694814 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694832 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694868 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694891 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694927 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694950 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.694972 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695005 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695025 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695043 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 28 13:19:25 crc 
kubenswrapper[4747]: I1128 13:19:25.695062 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695094 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695119 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695137 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695166 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695185 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695232 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695259 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695277 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695312 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 28 13:19:25 crc kubenswrapper[4747]: 
I1128 13:19:25.695352 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695392 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695417 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695439 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695484 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695529 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695784 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695812 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695855 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695878 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695898 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 
13:19:25.695937 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695957 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.695977 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696021 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696044 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696065 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696104 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696124 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696145 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696181 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696235 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 28 13:19:25 crc 
kubenswrapper[4747]: I1128 13:19:25.696259 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696304 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696332 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696357 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696375 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696402 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696508 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696544 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696571 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696591 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696888 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696907 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.696913 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697058 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697091 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697091 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697127 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697137 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697202 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697270 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.697369 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:19:26.197335496 +0000 UTC m=+18.859817226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697439 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697646 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.697947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698245 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698288 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698314 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 28 13:19:25 
crc kubenswrapper[4747]: I1128 13:19:25.698325 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698342 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698398 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698424 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698449 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698475 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698498 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698503 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698523 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698552 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698578 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698601 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698626 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698650 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698673 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698699 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698727 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698752 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698774 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698801 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698825 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698849 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698874 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698898 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698920 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698944 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698968 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.698992 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699021 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699047 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699075 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699084 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699098 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699128 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699156 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod 
\"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699181 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699227 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699258 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699286 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699290 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699316 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699344 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699378 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699403 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699425 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699451 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699461 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699479 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699552 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699575 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699613 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699712 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699765 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699794 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699819 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 
28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699849 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699875 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699882 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699907 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699938 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699966 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.699993 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700017 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700049 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700075 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700099 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700126 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700152 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700156 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700176 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700221 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700253 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700282 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700308 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700331 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700354 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700381 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700408 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700436 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700462 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700486 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700514 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700542 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700573 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700598 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700623 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700648 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700672 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700748 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700781 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700804 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 
28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700827 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700854 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700877 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700902 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700928 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700955 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700984 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701015 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701042 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701067 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701092 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 
13:19:25.701116 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701139 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701168 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701193 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701235 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701288 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701311 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701335 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701358 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701385 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701413 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701437 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701462 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701520 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701548 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701569 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 
13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701591 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701616 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701644 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701667 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701693 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701717 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701746 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701774 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701832 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701865 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701898 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701932 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.701965 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702003 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702063 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:25 crc kubenswrapper[4747]: 
I1128 13:19:25.702104 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702137 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702172 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702224 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702254 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702282 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702314 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702407 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702425 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702441 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702457 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc 
kubenswrapper[4747]: I1128 13:19:25.702472 4747 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702487 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702500 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702517 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702532 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702545 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702559 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702573 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702587 4747 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702600 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702612 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703049 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703900 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700355 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710606 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700502 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700657 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710781 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710825 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700989 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702134 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702643 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.702954 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703284 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703426 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703518 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703464 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703563 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703811 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.703953 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704017 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704460 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.711101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704487 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704558 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704904 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704918 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704937 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.704989 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.705001 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.711286 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.705286 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.706036 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.706311 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.706630 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.706784 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.706850 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.707097 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.707561 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.707898 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.708297 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.708699 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.708700 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.708777 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709004 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709054 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709431 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709477 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709503 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709656 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709686 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709808 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.709912 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710189 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710411 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.711604 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710507 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710572 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.700808 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.710898 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.711605 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.711919 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.712056 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.711988 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.712389 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.712498 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.712947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.713812 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714050 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714127 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.714473 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714485 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714509 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714831 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714898 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.714980 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.715244 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.715345 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.715458 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.715627 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.715673 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.715836 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.716122 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.716376 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.716525 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.716732 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.716889 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717241 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717289 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717335 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717425 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717528 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717726 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717906 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.717957 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.718259 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.718278 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.718578 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.718856 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.718893 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.718934 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:26.218868098 +0000 UTC m=+18.881350058 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.719014 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.719080 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.719330 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.719444 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:26.219416092 +0000 UTC m=+18.881897822 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.719512 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.719826 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.719816 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.720421 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.720546 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.720734 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.721101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.721148 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.721398 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.721668 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.722273 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.722676 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.722753 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.722833 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.722984 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.723333 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.723570 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.723758 4747 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.724237 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.724769 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.725301 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.725498 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.725597 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.725869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.725885 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.726285 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.726512 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.726524 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.726769 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.726833 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.727191 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.727301 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.727564 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.728225 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.728619 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.729115 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.729166 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.729825 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.729836 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.730180 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.730318 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.730872 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.731200 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.736263 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.736564 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.736816 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737054 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737149 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737263 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737072 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737380 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737404 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737475 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:26.237452802 +0000 UTC m=+18.899934742 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.737341 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.737613 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:26.237587876 +0000 UTC m=+18.900069606 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.737315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.741838 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.742790 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.742824 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.743348 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.745383 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.745664 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.746858 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.747350 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.747044 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.747470 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.748226 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.749311 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.750245 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.750688 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.750582 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.751812 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6" exitCode=255
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.751867 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.751948 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6"}
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.752134 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.760850 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.761474 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.762900 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.765492 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.765800 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.766770 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.775590 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.775657 4747 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.775766 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.776565 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.776586 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.776675 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.776761 4747 scope.go:117] "RemoveContainer" containerID="13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.776810 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.777289 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.777443 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.777718 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.777757 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.781278 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.782952 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.783425 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.783598 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.784939 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.787064 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.787852 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.787890 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.803243 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}
,{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.803896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.803922 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.803968 4747 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.803978 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" 
(UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.803988 4747 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.803998 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804007 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804015 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804024 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804032 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804041 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804050 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804059 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804066 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804074 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804083 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804093 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804103 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804118 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804127 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804135 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804143 4747 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804151 4747 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804159 4747 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804168 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804176 4747 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804184 4747 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804192 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804201 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804232 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804241 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804249 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 
13:19:25.804257 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804265 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804274 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804282 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804290 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804299 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804307 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804315 4747 reconciler_common.go:293] "Volume 
detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804323 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804332 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804340 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804348 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804356 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804363 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804371 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" 
DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804379 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804388 4747 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804396 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804404 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804412 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804420 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804428 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804436 4747 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804444 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804452 4747 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804460 4747 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804469 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804476 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804484 4747 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804492 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804500 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804508 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804516 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804524 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804533 4747 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804541 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804548 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 
13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804557 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804566 4747 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804574 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804582 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804591 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804599 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804608 4747 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804615 4747 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804596 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804666 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804623 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804710 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804723 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804734 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804745 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804756 4747 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804767 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804776 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804787 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804796 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804806 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804816 4747 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804826 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804836 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804848 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804858 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804868 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804877 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804889 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804899 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804912 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804924 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804935 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804944 4747 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804953 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804965 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on 
node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804976 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.804988 4747 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805001 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805012 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805022 4747 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805033 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805043 4747 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: 
I1128 13:19:25.805053 4747 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805063 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805075 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805084 4747 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805094 4747 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805106 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805121 4747 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805132 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805144 4747 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805159 4747 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805168 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805178 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805189 4747 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805198 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805225 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805235 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805245 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805256 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805266 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805276 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805287 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805298 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: 
I1128 13:19:25.805309 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805321 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805331 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805342 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805352 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805362 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805373 4747 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805383 4747 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805393 4747 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805403 4747 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805412 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805423 4747 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805434 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805446 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805459 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805469 4747 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805482 4747 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805491 4747 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805502 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805513 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805523 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805534 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805545 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805555 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805565 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805575 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805585 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805596 4747 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805608 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805620 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805634 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805643 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805654 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805663 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805673 4747 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805684 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805694 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805705 4747 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805715 4747 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805725 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805735 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805748 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805758 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805768 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805779 4747 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.805789 4747 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.823184 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: E1128 13:19:25.830045 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.830225 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.834725 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.844569 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.847983 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.860966 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.875827 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.890146 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.895756 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.903803 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.904277 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.906761 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.906867 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.906941 4747 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 28 13:19:25 crc kubenswrapper[4747]: W1128 13:19:25.912798 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-cac6d909ee836a1d328f5279f6bee6bc837afcb7467acfd66f64388001794b1b WatchSource:0}: Error finding container cac6d909ee836a1d328f5279f6bee6bc837afcb7467acfd66f64388001794b1b: Status 404 returned error can't find the container with id cac6d909ee836a1d328f5279f6bee6bc837afcb7467acfd66f64388001794b1b Nov 28 13:19:25 crc kubenswrapper[4747]: I1128 13:19:25.913114 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:19:25 crc kubenswrapper[4747]: W1128 13:19:25.918399 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-dd188611203f451e4b424c986007b48254170e048cbc7a99d0f9ff510a72993e WatchSource:0}: Error finding container dd188611203f451e4b424c986007b48254170e048cbc7a99d0f9ff510a72993e: Status 404 returned error can't find the container with id dd188611203f451e4b424c986007b48254170e048cbc7a99d0f9ff510a72993e Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.209765 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.210026 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:19:27.209995603 +0000 UTC m=+19.872477333 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.310390 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.310456 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.310481 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.310505 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310595 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310612 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310627 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310624 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310676 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:27.310652938 +0000 UTC m=+19.973134668 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310641 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310716 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310739 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:27.310708929 +0000 UTC m=+19.973190849 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310748 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310764 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310766 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:27.31075528 +0000 UTC m=+19.973237240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:26 crc kubenswrapper[4747]: E1128 13:19:26.310830 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:27.310810522 +0000 UTC m=+19.973292242 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.548005 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t9h2n"] Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.548421 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.551152 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.551352 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.551458 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.551954 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zbzpq"] Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.552336 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.554520 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.555622 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.556499 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.556815 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.557250 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.566162 
4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.581425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 
UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.603868 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.613715 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c533d335-7419-4f71-857b-2dbf2274a2cd-hosts-file\") pod \"node-resolver-t9h2n\" (UID: \"c533d335-7419-4f71-857b-2dbf2274a2cd\") " pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.613779 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pht9x\" (UniqueName: \"kubernetes.io/projected/c533d335-7419-4f71-857b-2dbf2274a2cd-kube-api-access-pht9x\") pod \"node-resolver-t9h2n\" (UID: \"c533d335-7419-4f71-857b-2dbf2274a2cd\") " pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.613804 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmwcq\" (UniqueName: 
\"kubernetes.io/projected/bc55136c-24a8-4913-b8b9-afe93e54fd83-kube-api-access-hmwcq\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.613842 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bc55136c-24a8-4913-b8b9-afe93e54fd83-rootfs\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.613865 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc55136c-24a8-4913-b8b9-afe93e54fd83-proxy-tls\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.613885 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bc55136c-24a8-4913-b8b9-afe93e54fd83-mcd-auth-proxy-config\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.619702 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.633617 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.647273 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.661452 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.688701 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.705740 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.714450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c533d335-7419-4f71-857b-2dbf2274a2cd-hosts-file\") pod \"node-resolver-t9h2n\" (UID: \"c533d335-7419-4f71-857b-2dbf2274a2cd\") " pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc 
kubenswrapper[4747]: I1128 13:19:26.714561 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pht9x\" (UniqueName: \"kubernetes.io/projected/c533d335-7419-4f71-857b-2dbf2274a2cd-kube-api-access-pht9x\") pod \"node-resolver-t9h2n\" (UID: \"c533d335-7419-4f71-857b-2dbf2274a2cd\") " pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.714591 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmwcq\" (UniqueName: \"kubernetes.io/projected/bc55136c-24a8-4913-b8b9-afe93e54fd83-kube-api-access-hmwcq\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.714636 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bc55136c-24a8-4913-b8b9-afe93e54fd83-rootfs\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.714662 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc55136c-24a8-4913-b8b9-afe93e54fd83-proxy-tls\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.714690 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bc55136c-24a8-4913-b8b9-afe93e54fd83-mcd-auth-proxy-config\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.714657 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c533d335-7419-4f71-857b-2dbf2274a2cd-hosts-file\") pod \"node-resolver-t9h2n\" (UID: \"c533d335-7419-4f71-857b-2dbf2274a2cd\") " pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.715503 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bc55136c-24a8-4913-b8b9-afe93e54fd83-rootfs\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.715840 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bc55136c-24a8-4913-b8b9-afe93e54fd83-mcd-auth-proxy-config\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.718819 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.718840 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc55136c-24a8-4913-b8b9-afe93e54fd83-proxy-tls\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.731466 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.733398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmwcq\" (UniqueName: \"kubernetes.io/projected/bc55136c-24a8-4913-b8b9-afe93e54fd83-kube-api-access-hmwcq\") pod \"machine-config-daemon-zbzpq\" (UID: \"bc55136c-24a8-4913-b8b9-afe93e54fd83\") " pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.744659 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pht9x\" (UniqueName: \"kubernetes.io/projected/c533d335-7419-4f71-857b-2dbf2274a2cd-kube-api-access-pht9x\") pod \"node-resolver-t9h2n\" (UID: \"c533d335-7419-4f71-857b-2dbf2274a2cd\") " pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.752304 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.754817 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e"} Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.754865 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dd188611203f451e4b424c986007b48254170e048cbc7a99d0f9ff510a72993e"} Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.760018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4"} Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.760075 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac"} Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.760086 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cac6d909ee836a1d328f5279f6bee6bc837afcb7467acfd66f64388001794b1b"} Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.761739 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.763058 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a"} Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.763184 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 
13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.764226 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3fc3b7def14d516abeb33ac58f47fbb25059a646356881313f41e10fbeb870f9"} Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.766047 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.784093 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 
UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.802690 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.843896 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.861327 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t9h2n" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.866664 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.874444 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.907681 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:26 crc kubenswrapper[4747]: I1128 13:19:26.984969 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8
831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:26Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.018096 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.019965 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-78psz"] Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.020459 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.022065 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6sv29"] Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.022814 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.024962 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.025296 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.025428 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.025607 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.025761 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.026753 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.029132 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.049757 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.067783 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.082216 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.099233 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.114983 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120378 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-system-cni-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120406 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-conf-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120422 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-daemon-config\") pod 
\"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120457 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kphz\" (UniqueName: \"kubernetes.io/projected/11d91e3e-309b-4e83-9b0c-1f589c7670f6-kube-api-access-8kphz\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120486 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-cni-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120504 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-kubelet\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120517 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-etc-kubernetes\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120537 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11d91e3e-309b-4e83-9b0c-1f589c7670f6-cni-binary-copy\") pod \"multus-78psz\" (UID: 
\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120553 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-hostroot\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120572 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cnibin\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120674 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120729 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120756 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-socket-dir-parent\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-cni-multus\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120828 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-k8s-cni-cncf-io\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120847 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6gw4\" (UniqueName: \"kubernetes.io/projected/c6d63baf-0ac0-4940-bd10-3ca1967456ca-kube-api-access-j6gw4\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120870 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-cni-bin\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120907 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-system-cni-dir\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120939 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.120963 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-os-release\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.121021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-os-release\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.121046 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-multus-certs\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.121072 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-cnibin\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.121093 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-netns\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.132436 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.154635 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.185472 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.215152 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8
831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221501 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221660 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-cni-multus\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221683 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221704 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-socket-dir-parent\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-k8s-cni-cncf-io\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221749 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6gw4\" (UniqueName: \"kubernetes.io/projected/c6d63baf-0ac0-4940-bd10-3ca1967456ca-kube-api-access-j6gw4\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-cni-bin\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221785 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-system-cni-dir\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221826 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-os-release\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221860 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-multus-certs\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221879 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-os-release\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221898 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-cnibin\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221913 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-netns\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221931 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kphz\" (UniqueName: \"kubernetes.io/projected/11d91e3e-309b-4e83-9b0c-1f589c7670f6-kube-api-access-8kphz\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221948 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-system-cni-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221966 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-conf-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221983 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-daemon-config\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " 
pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.221998 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-cni-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222014 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-kubelet\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222030 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-etc-kubernetes\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222065 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11d91e3e-309b-4e83-9b0c-1f589c7670f6-cni-binary-copy\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222087 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-hostroot\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222109 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cnibin\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222129 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.222536 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:19:29.222510275 +0000 UTC m=+21.884992005 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222590 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-cni-multus\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222955 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-conf-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222951 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-socket-dir-parent\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.222962 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-kubelet\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223046 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223102 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-cnibin\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223109 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-etc-kubernetes\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223137 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-system-cni-dir\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223141 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-netns\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223165 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-k8s-cni-cncf-io\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223185 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-var-lib-cni-bin\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223243 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-cni-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223273 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-host-run-multus-certs\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223267 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-system-cni-dir\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223324 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cnibin\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " 
pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223283 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6d63baf-0ac0-4940-bd10-3ca1967456ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223343 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-os-release\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223303 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-hostroot\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223565 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6d63baf-0ac0-4940-bd10-3ca1967456ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223742 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11d91e3e-309b-4e83-9b0c-1f589c7670f6-os-release\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 
13:19:27.223878 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11d91e3e-309b-4e83-9b0c-1f589c7670f6-cni-binary-copy\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.223983 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11d91e3e-309b-4e83-9b0c-1f589c7670f6-multus-daemon-config\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.237264 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.255122 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.271489 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.283890 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.298986 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.314558 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.321563 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kphz\" (UniqueName: \"kubernetes.io/projected/11d91e3e-309b-4e83-9b0c-1f589c7670f6-kube-api-access-8kphz\") pod \"multus-78psz\" (UID: \"11d91e3e-309b-4e83-9b0c-1f589c7670f6\") " pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.321606 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6gw4\" (UniqueName: \"kubernetes.io/projected/c6d63baf-0ac0-4940-bd10-3ca1967456ca-kube-api-access-j6gw4\") pod \"multus-additional-cni-plugins-6sv29\" (UID: \"c6d63baf-0ac0-4940-bd10-3ca1967456ca\") " pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.323528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 
13:19:27.323587 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.323625 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.323656 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323658 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323743 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:29.323723544 +0000 UTC m=+21.986205264 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323741 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323789 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:29.323782485 +0000 UTC m=+21.986264215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323790 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323819 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323838 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323840 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323882 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323903 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323902 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:29.323880548 +0000 UTC m=+21.986362478 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.323980 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:29.32396268 +0000 UTC m=+21.986444410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.328512 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.349972 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.357404 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-78psz" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.365577 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6sv29" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.367981 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.369466 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d91e3e_309b_4e83_9b0c_1f589c7670f6.slice/crio-a28c119365c1ab136f4eb16abd626552ff8d5a4fcf4a35201d43c78be91933b4 WatchSource:0}: Error finding container a28c119365c1ab136f4eb16abd626552ff8d5a4fcf4a35201d43c78be91933b4: Status 404 returned error can't find the container with id a28c119365c1ab136f4eb16abd626552ff8d5a4fcf4a35201d43c78be91933b4 Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.383176 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d63baf_0ac0_4940_bd10_3ca1967456ca.slice/crio-945a1de4c29a73d5ce1be28799fc4ed61d50d06df4df8bb7312258f90d03b0ac WatchSource:0}: Error finding container 945a1de4c29a73d5ce1be28799fc4ed61d50d06df4df8bb7312258f90d03b0ac: Status 404 returned error can't find the container with id 945a1de4c29a73d5ce1be28799fc4ed61d50d06df4df8bb7312258f90d03b0ac Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.393677 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.410105 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.432425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.441247 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hb2vp"] Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.442095 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.444522 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.444853 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.445244 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.445379 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.445507 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.445663 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.446399 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.466574 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.476411 4747 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.476717 4747 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.476811 4747 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.476850 4747 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.476883 4747 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.477820 4747 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.477861 4747 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.477893 4747 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.478560 4747 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.478599 4747 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.478990 4747 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479019 4747 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479052 4747 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479274 4747 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479656 4747 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479901 4747 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479908 4747 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of 
*v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479930 4747 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.479939 4747 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.480007 4747 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.480071 4747 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-config": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.480099 4747 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.480092 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Patch 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-operator/pods/network-operator-58b4c7f79c-55gtf/status\": read tcp 38.102.83.143:49070->38.102.83.143:6443: use of closed network connection" Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.480289 4747 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.506274 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526360 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-env-overrides\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526431 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rlqb\" (UniqueName: \"kubernetes.io/projected/a52417df-b828-4251-a786-afae5d1aa9fd-kube-api-access-9rlqb\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526475 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a52417df-b828-4251-a786-afae5d1aa9fd-ovn-node-metrics-cert\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc 
kubenswrapper[4747]: I1128 13:19:27.526498 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-bin\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526523 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-etc-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526545 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526566 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-script-lib\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526581 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-netd\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526599 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-netns\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526614 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-config\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526634 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-log-socket\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526687 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-ovn\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526700 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-node-log\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526715 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-ovn-kubernetes\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526732 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526761 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-kubelet\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526779 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-slash\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526793 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-var-lib-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: 
\"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526819 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-systemd-units\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.526836 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-systemd\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.527023 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.544458 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.582652 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.604496 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.621329 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627477 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-log-socket\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627527 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-ovn-kubernetes\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627544 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-ovn\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627561 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-node-log\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627584 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-openvswitch\") pod 
\"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627607 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-slash\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627626 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-var-lib-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627638 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-log-socket\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627727 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-node-log\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627762 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-ovn-kubernetes\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627836 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-kubelet\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627862 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-systemd-units\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627888 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-systemd\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627917 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-env-overrides\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.627912 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 
13:19:27.627937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rlqb\" (UniqueName: \"kubernetes.io/projected/a52417df-b828-4251-a786-afae5d1aa9fd-kube-api-access-9rlqb\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628063 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-bin\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a52417df-b828-4251-a786-afae5d1aa9fd-ovn-node-metrics-cert\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628140 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-etc-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628175 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 
13:19:27.628259 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-script-lib\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628278 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-ovn\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628294 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-netd\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628340 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-netd\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628347 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-netns\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628378 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-config\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628393 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-slash\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628461 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-bin\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628499 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-kubelet\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628539 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-systemd-units\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628573 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-systemd\") pod \"ovnkube-node-hb2vp\" 
(UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628970 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-var-lib-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.628985 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-netns\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.629076 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-etc-openvswitch\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.629107 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.629878 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-config\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.629923 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-script-lib\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.630332 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-env-overrides\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.635799 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a52417df-b828-4251-a786-afae5d1aa9fd-ovn-node-metrics-cert\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.640907 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.641069 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.641172 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.641251 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.641338 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:27 crc kubenswrapper[4747]: E1128 13:19:27.641399 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.649482 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.650182 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.651496 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.652141 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.653286 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.654053 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.654786 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.655456 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.659917 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.660562 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.661567 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.662305 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.663305 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.663871 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.664755 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.665027 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.665616 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.666258 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.667402 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.668021 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.671167 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.671865 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.672481 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.673438 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.674145 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.674638 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.675570 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rlqb\" (UniqueName: \"kubernetes.io/projected/a52417df-b828-4251-a786-afae5d1aa9fd-kube-api-access-9rlqb\") pod \"ovnkube-node-hb2vp\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.675812 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.677916 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.678523 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.688387 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.689128 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.689490 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.690084 4747 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.690268 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.694021 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.695467 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.696296 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.700245 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.701430 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.702678 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.703908 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.707191 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.708223 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.710618 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.711967 4747 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.715621 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.716436 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.717521 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.719517 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.722308 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.723370 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.724399 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.725006 4747 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.727120 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.727981 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.728838 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.730118 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.739614 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with 
incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mount
Path\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.755392 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.771863 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: W1128 13:19:27.775060 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52417df_b828_4251_a786_afae5d1aa9fd.slice/crio-30df8e79cce6aecfda6809ed6a9210cf4df4b6bda193a518a8aeee17eadc3bec WatchSource:0}: Error finding container 30df8e79cce6aecfda6809ed6a9210cf4df4b6bda193a518a8aeee17eadc3bec: Status 404 returned error can't find the container with id 30df8e79cce6aecfda6809ed6a9210cf4df4b6bda193a518a8aeee17eadc3bec Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.776116 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78psz" event={"ID":"11d91e3e-309b-4e83-9b0c-1f589c7670f6","Type":"ContainerStarted","Data":"65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.776193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78psz" event={"ID":"11d91e3e-309b-4e83-9b0c-1f589c7670f6","Type":"ContainerStarted","Data":"a28c119365c1ab136f4eb16abd626552ff8d5a4fcf4a35201d43c78be91933b4"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.778470 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.778531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" 
event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.778544 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"3fe52c79921aa5157e5e3f6c5982c0c5f411113a17f2a37d207c58acb62e0956"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.779711 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t9h2n" event={"ID":"c533d335-7419-4f71-857b-2dbf2274a2cd","Type":"ContainerStarted","Data":"420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.779739 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t9h2n" event={"ID":"c533d335-7419-4f71-857b-2dbf2274a2cd","Type":"ContainerStarted","Data":"28e18eb9467aa61aece1344bd8c8de144c48c64efff7882265790fc969963a88"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.781098 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerStarted","Data":"945a1de4c29a73d5ce1be28799fc4ed61d50d06df4df8bb7312258f90d03b0ac"} Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.790861 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.802813 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.817924 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.833330 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.848041 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.864324 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.878914 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.890602 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.906541 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.927559 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.948861 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.961315 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:27 crc kubenswrapper[4747]: I1128 13:19:27.978495 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.001626 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.022060 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.034823 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.053775 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.074601 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.100903 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.127422 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.173800 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.212844 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.254414 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.290068 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.325040 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.355593 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.376757 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.404769 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\
\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.486253 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.519953 4747 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.563523 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.644340 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.683078 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.687765 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.689282 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.695440 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.700392 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.783190 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.791520 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6d63baf-0ac0-4940-bd10-3ca1967456ca" containerID="a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2" exitCode=0 Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.791585 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerDied","Data":"a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2"} Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.793513 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540" exitCode=0 Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.793607 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.793652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"30df8e79cce6aecfda6809ed6a9210cf4df4b6bda193a518a8aeee17eadc3bec"} Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.795334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172"} Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.806344 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.811255 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.817196 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.823427 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539
d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.839037 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.844458 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.861091 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.882313 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.882362 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.923146 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:28Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.936075 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 28 13:19:28 crc kubenswrapper[4747]: I1128 13:19:28.977028 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.007865 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/o
pt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.016056 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.031049 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.033847 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.033897 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.033915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.034130 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.035615 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.056703 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.097135 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.156182 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.159234 4747 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.159566 4747 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.161034 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.161203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.161367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.161497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.161618 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.185417 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.190344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.190407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.190424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.190449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.190463 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.208390 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.209332 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.209482 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.209606 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.209754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.209901 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.244522 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.248973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.249168 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:19:33.24913488 +0000 UTC m=+25.911616730 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.249359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.249386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.249394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.249414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.249426 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.253396 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z 
is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.263483 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.266659 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.266692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.266703 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.266723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.266738 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.285719 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.285908 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.286472 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.288297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.288332 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.288349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.288375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.288405 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.331953 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.350118 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.350170 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.350199 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.350258 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350406 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350431 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350444 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350494 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:33.350480462 +0000 UTC m=+26.012962192 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350551 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350560 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350567 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350588 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:33.350582495 +0000 UTC m=+26.013064225 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350712 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350778 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350842 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:33.350810861 +0000 UTC m=+26.013292761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.350886 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:19:33.350859932 +0000 UTC m=+26.013341662 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.369653 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.390627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.390668 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.390680 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.390700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.390715 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.407958 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.451772 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.490343 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.492909 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.492956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.492966 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.492988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.493001 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.524702 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.563460 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.595739 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.595769 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.595780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.595803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.595814 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.614445 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.640918 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.641101 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.641195 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.641272 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.641338 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:29 crc kubenswrapper[4747]: E1128 13:19:29.641396 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.654002 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.684955 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lh4kn"] Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.685686 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.695333 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.696036 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.698544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.698579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.698589 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.698606 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.698617 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.716258 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.735540 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.754014 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-host\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.754311 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-serviceca\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.754334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9h9n\" (UniqueName: \"kubernetes.io/projected/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-kube-api-access-j9h9n\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.757287 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.800751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.800792 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.800805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.800822 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.800834 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.803888 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.803928 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.803942 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.803954 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.803967 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.803977 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.806377 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6d63baf-0ac0-4940-bd10-3ca1967456ca" containerID="51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae" exitCode=0 Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.806582 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerDied","Data":"51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.819353 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.844613 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.855021 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9h9n\" (UniqueName: \"kubernetes.io/projected/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-kube-api-access-j9h9n\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.855255 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-host\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.855358 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-serviceca\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.855381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-host\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.857029 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-serviceca\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.896496 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9h9n\" (UniqueName: \"kubernetes.io/projected/aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce-kube-api-access-j9h9n\") pod \"node-ca-lh4kn\" (UID: \"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\") " pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.910572 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.915677 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.915739 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.915750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.915772 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.915785 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:29Z","lastTransitionTime":"2025-11-28T13:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.945752 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:29 crc kubenswrapper[4747]: I1128 13:19:29.987723 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:29Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.000247 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lh4kn" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.019154 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.019185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.019194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.019223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.019235 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.030486 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.068173 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.104371 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.122361 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.122412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.122421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.122448 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.122460 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.144840 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.183621 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.226115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.226192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.226218 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.226241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.226256 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.232019 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.263992 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.306258 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.331540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.331670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.331683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.331700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.331710 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.346408 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.385544 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.423883 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.434445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.434493 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.434505 4747 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.434524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.434538 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.472374 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.506201 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.536878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.536938 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.536955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.536980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.536995 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.545798 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z 
is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.640118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.640172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.640183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.640222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.640237 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.742834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.742899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.742916 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.742949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.742967 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.812418 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6d63baf-0ac0-4940-bd10-3ca1967456ca" containerID="624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f" exitCode=0 Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.812497 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerDied","Data":"624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.814734 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lh4kn" event={"ID":"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce","Type":"ContainerStarted","Data":"ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.814796 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lh4kn" event={"ID":"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce","Type":"ContainerStarted","Data":"8a2744d712db46ee14b70a0989c0e9712dd72641f37597ec5f6524987c8601aa"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.828333 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.842487 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.845505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 
13:19:30.845563 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.845576 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.845598 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.845611 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.857997 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.869451 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.890813 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 
13:19:30.909084 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.921704 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.937848 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.949267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.949318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.949328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.949350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.949368 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:30Z","lastTransitionTime":"2025-11-28T13:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:30 crc kubenswrapper[4747]: I1128 13:19:30.953006 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T
13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:30Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.010509 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.053725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc 
kubenswrapper[4747]: I1128 13:19:31.053776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.053792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.053812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.053824 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.059665 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.077069 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.089761 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.105389 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.146073 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.156342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.156383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.156396 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.156426 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.156440 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.186762 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.227760 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.259322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.259377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.259420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.259452 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.259467 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.271772 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.304592 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.346751 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.363589 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.363671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.363691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.363721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.363740 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.385651 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.424240 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.466057 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 
13:19:31.466478 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.466508 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.466517 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.466537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.466552 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.516343 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.547031 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.569721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.569790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.569803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 
13:19:31.569827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.569840 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.592631 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.629569 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.641423 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.641472 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.641423 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:31 crc kubenswrapper[4747]: E1128 13:19:31.641561 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:31 crc kubenswrapper[4747]: E1128 13:19:31.641622 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:31 crc kubenswrapper[4747]: E1128 13:19:31.641690 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.669171 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.673635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.673752 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.673813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.673875 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.673939 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.777596 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.778069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.778083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.778103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.778133 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.825367 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6d63baf-0ac0-4940-bd10-3ca1967456ca" containerID="3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170" exitCode=0 Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.825901 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerDied","Data":"3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.832341 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.847769 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.864148 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.882932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.882974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.882988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.883011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.883024 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.883785 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.899777 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.914696 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.929883 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.947225 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.986594 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:31Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.987042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.987067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.987075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.987091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:31 crc kubenswrapper[4747]: I1128 13:19:31.987102 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:31Z","lastTransitionTime":"2025-11-28T13:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.032627 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.065120 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.090454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.090515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.090524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 
13:19:32.090543 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.090556 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.110449 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.147398 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.185893 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.208456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc 
kubenswrapper[4747]: I1128 13:19:32.208488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.208499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.208521 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.208534 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.231604 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.256646 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.260919 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.267439 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.283301 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.311543 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.311600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.311613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.311632 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.311645 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.325348 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.367927 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.403177 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.421333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.421374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.421383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 
13:19:32.421399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.421410 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.445890 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.484509 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.523991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.524318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.524388 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.524478 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.524637 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.527727 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.568703 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.620907 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.626974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.626995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.627004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.627019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.627027 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.645704 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.686377 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.725648 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.729171 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.729193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.729216 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc 
kubenswrapper[4747]: I1128 13:19:32.729231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.729241 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.764582 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.805353 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539
d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.831798 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.831854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.831864 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc 
kubenswrapper[4747]: I1128 13:19:32.831880 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.831891 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.840616 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6d63baf-0ac0-4940-bd10-3ca1967456ca" containerID="df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235" exitCode=0 Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.840704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerDied","Data":"df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.844706 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.887738 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.926073 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.934126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.934175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.934186 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.934223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.934237 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:32Z","lastTransitionTime":"2025-11-28T13:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:32 crc kubenswrapper[4747]: I1128 13:19:32.964459 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d01
2f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:32Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.010277 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.044939 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.045443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.045492 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.045519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.045536 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.052090 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.091139 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.125642 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.148091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.148129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.148140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.148188 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.148217 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.168733 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.206756 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.244991 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.251191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.251318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.251404 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.251474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.251555 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.286802 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.318117 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.318426 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:19:41.318401926 +0000 UTC m=+33.980883656 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.326220 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.354442 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.354509 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.354526 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.354548 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.354564 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.364374 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.405265 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.418847 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.418903 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.418937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.418963 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.418997 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419067 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:41.41904791 +0000 UTC m=+34.081529640 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419133 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419158 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419172 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419261 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:41.419240205 +0000 UTC m=+34.081722005 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419328 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419371 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419393 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419407 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419372 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:41.419361678 +0000 UTC m=+34.081843538 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.419454 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:41.419446511 +0000 UTC m=+34.081928241 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.444649 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.456845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.456881 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc 
kubenswrapper[4747]: I1128 13:19:33.456891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.456913 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.456924 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.484374 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.524377 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.559892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.559948 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.559962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.559983 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.559996 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.565403 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.606295 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.640785 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.640880 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.640995 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.641054 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.641182 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:33 crc kubenswrapper[4747]: E1128 13:19:33.641339 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.649779 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.662408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.662487 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.662505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.662534 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.662556 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.686355 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.726302 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.766180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.766577 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.766691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.766814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.766905 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.771824 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.823267 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.845109 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.850814 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6d63baf-0ac0-4940-bd10-3ca1967456ca" containerID="a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3" exitCode=0 Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.850885 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerDied","Data":"a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.870788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.870848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.870861 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.870884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.870903 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.892789 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.934806 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.966992 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:33Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.974193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.974240 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.974249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.974267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:33 crc kubenswrapper[4747]: I1128 13:19:33.974279 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:33Z","lastTransitionTime":"2025-11-28T13:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.009830 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z 
is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.047488 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.076323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.076357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.076367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.076386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.076404 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.086561 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.125768 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.163549 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.180886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.180931 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.180943 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.180963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.180976 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.206477 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.244912 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.284630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.284661 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.284671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.284690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.284703 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.288293 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.325982 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.369548 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.388070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.388132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.388150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.388175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.388196 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.408268 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.459430 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.491242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.491627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.491639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.491654 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.491668 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.502264 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.532040 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.566771 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.593820 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.593847 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.593856 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.593872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.593882 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.607433 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z 
is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.696852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.696913 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.696925 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.696944 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.696957 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.800698 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.800785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.800807 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.800840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.800863 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.861815 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" event={"ID":"c6d63baf-0ac0-4940-bd10-3ca1967456ca","Type":"ContainerStarted","Data":"127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.869575 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.869991 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.890934 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.904450 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:34 crc 
kubenswrapper[4747]: I1128 13:19:34.904500 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.904511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.904528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.904540 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:34Z","lastTransitionTime":"2025-11-28T13:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.913837 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.933336 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.953903 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:
09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.975794 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:34 crc kubenswrapper[4747]: I1128 13:19:34.996505 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:34Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.007953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.008002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.008016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.008038 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.008053 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.015933 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.038376 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.055658 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539
d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.074867 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.093256 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.111390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.111443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.111455 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.111478 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.111492 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.119917 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.133985 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.152195 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.170157 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.205557 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.215151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.215245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.215267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.215298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.215317 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.284658 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.306603 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.317591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.317641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.317650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc 
kubenswrapper[4747]: I1128 13:19:35.317668 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.317681 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.326304 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 
13:19:35.365084 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.405387 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.420284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.420327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.420341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.420362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.420376 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.452928 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.499663 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.523608 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.523681 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.523700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.523725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.523739 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.532625 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.580368 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.607901 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.627313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.627368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.627380 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.627402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.627414 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.640973 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.641054 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.640973 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:35 crc kubenswrapper[4747]: E1128 13:19:35.641151 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:35 crc kubenswrapper[4747]: E1128 13:19:35.641285 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:35 crc kubenswrapper[4747]: E1128 13:19:35.641373 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.645531 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.686418 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.735053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.735099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.735109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.735128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.735141 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.750310 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.768868 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.806772 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.839341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.839385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.839398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.839419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.839433 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.872798 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.873238 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.897127 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.912447 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.928264 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.939814 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.941527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.941563 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.941573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.941590 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.941602 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:35Z","lastTransitionTime":"2025-11-28T13:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:35 crc kubenswrapper[4747]: I1128 13:19:35.969495 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:35Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.011626 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.044148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.044238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.044259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.044288 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.044307 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.049604 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.097481 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879
e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.127163 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.146819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.146870 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.146889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.146919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.146940 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.170355 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.208662 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.246585 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.249523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.249561 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.249570 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.249589 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.249604 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.288321 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.325960 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.352290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.352338 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.352348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.352366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.352377 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.368502 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.406376 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.455330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.455399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.455413 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.455435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.455449 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.558859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.558915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.558935 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.558964 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.558987 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.662588 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.662670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.662694 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.662729 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.662752 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.765696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.765760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.765778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.765805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.765822 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.868775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.868825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.868840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.868862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.868874 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.879414 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/0.log" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.883142 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023" exitCode=1 Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.883250 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.885456 4747 scope.go:117] "RemoveContainer" containerID="a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.925450 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.952598 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:
09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.971712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.971778 
4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.971796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.971821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.971839 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:36Z","lastTransitionTime":"2025-11-28T13:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:36 crc kubenswrapper[4747]: I1128 13:19:36.976564 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.003645 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:36Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.029066 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.054903 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.082156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.082256 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.082279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc 
kubenswrapper[4747]: I1128 13:19:37.082310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.082330 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.083419 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 
13:19:37.099438 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.118439 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 
13:19:37.136203 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 
13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.151441 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.165651 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.186416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.186469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.186481 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.186505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.186520 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.193894 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.220044 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:36Z\\\",\\\"message\\\":\\\":160\\\\nI1128 13:19:36.182494 6043 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.182952 6043 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.182823 6043 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183174 6043 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.183643 6043 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183740 6043 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1128 13:19:36.183911 6043 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.184452 6043 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.235393 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.289849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.289919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.289937 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.289963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.289980 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.393498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.393535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.393544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.393561 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.393571 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.497447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.497516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.497536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.497576 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.497598 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.600318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.600360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.600372 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.600391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.600405 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.640702 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:37 crc kubenswrapper[4747]: E1128 13:19:37.640850 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.641049 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:37 crc kubenswrapper[4747]: E1128 13:19:37.641231 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.641311 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:37 crc kubenswrapper[4747]: E1128 13:19:37.641393 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.668723 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.688523 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.703636 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.703675 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.703685 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.703712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.703724 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.704179 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.724148 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.738293 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.749034 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.767715 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.789949 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13
:19:36Z\\\",\\\"message\\\":\\\":160\\\\nI1128 13:19:36.182494 6043 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.182952 6043 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.182823 6043 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183174 6043 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.183643 6043 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183740 6043 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1128 13:19:36.183911 6043 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.184452 6043 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.805802 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.805844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.805854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.805872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.805885 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.808827 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3e
da9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.820748 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.833901 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.851664 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.868722 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.889186 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/0.log" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.893024 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.893197 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.895701 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.909177 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.909238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.909251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.909272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.909285 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:37Z","lastTransitionTime":"2025-11-28T13:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.917304 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.932035 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.944783 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.957861 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.971804 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:37 crc kubenswrapper[4747]: I1128 13:19:37.988583 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:37Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.013068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.013121 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.013133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.013151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.013165 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.022702 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:36Z\\\",\\\"message\\\":\\\":160\\\\nI1128 13:19:36.182494 6043 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.182952 6043 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.182823 6043 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183174 6043 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.183643 6043 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183740 6043 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1128 13:19:36.183911 6043 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.184452 6043 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.038465 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.060745 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.082181 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.098340 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.112630 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.128178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.128244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.128263 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.128290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.128308 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.141873 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.166379 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.193895 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.216736 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.230794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.230843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc 
kubenswrapper[4747]: I1128 13:19:38.230854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.230877 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.230890 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.334254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.334311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.334325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.334348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.334364 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.438014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.438108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.438133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.438171 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.438194 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.541796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.541867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.541892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.541927 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.541952 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.646745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.646847 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.646865 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.646894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.646914 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.750665 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.750740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.750765 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.750795 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.750815 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.854294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.854381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.854402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.854768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.854993 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.898813 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/1.log" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.899621 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/0.log" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.903239 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59" exitCode=1 Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.903284 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.903342 4747 scope.go:117] "RemoveContainer" containerID="a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.905091 4747 scope.go:117] "RemoveContainer" containerID="c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59" Nov 28 13:19:38 crc kubenswrapper[4747]: E1128 13:19:38.905448 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.928150 4747 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.959117 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.959187 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.959199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.959252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.959269 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:38Z","lastTransitionTime":"2025-11-28T13:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.966858 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:38 crc kubenswrapper[4747]: I1128 13:19:38.996270 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:38Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.018658 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.049997 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.062469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.062506 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.062517 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.062538 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.062553 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.072504 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.091831 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.108058 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539
d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.119851 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.133361 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.152616 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:36Z\\\",\\\"message\\\":\\\":160\\\\nI1128 13:19:36.182494 6043 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.182952 6043 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.182823 6043 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183174 6043 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.183643 6043 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183740 6043 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1128 13:19:36.183911 6043 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.184452 6043 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 
13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.163151 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.165075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.165170 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.165190 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.165242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.165272 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.179276 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.194510 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.210425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.268662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.268732 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.268753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.268787 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.268808 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.371331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.371580 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.371665 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.371780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.371818 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.371815 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.408900 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.431047 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.454920 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.470499 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.474647 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.474707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.474726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.474755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.474774 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.490165 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.506038 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.526248 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.544277 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.559313 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.578115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 
13:19:39.578164 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.578176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.578200 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.578227 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.588012 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:36Z\\\",\\\"message\\\":\\\":160\\\\nI1128 13:19:36.182494 6043 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.182952 6043 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.182823 6043 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183174 6043 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.183643 6043 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183740 6043 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1128 13:19:36.183911 6043 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.184452 6043 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.601983 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.620767 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.622253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.622339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.622366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.622403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.622429 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.640250 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.640389 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.640389 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.640518 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.640570 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.640786 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.641024 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.640882 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.646971 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.647029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.647041 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.647059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.647072 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.661907 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.670917 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.674949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.674995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.675010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.675033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.675049 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.680298 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.689512 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.694632 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.694690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.694705 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.694728 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.694746 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.710605 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.714690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.714745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.714761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.714789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.714807 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.732940 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.733098 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.735764 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.735809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.735821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.735840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.735853 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.840019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.840087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.840105 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.840135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.840153 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.861418 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269"] Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.862155 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.864696 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.865735 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.888393 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6f
dd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef
38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b
1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.913503 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/1.log" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.920195 4747 scope.go:117] "RemoveContainer" containerID="c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59" Nov 28 13:19:39 crc kubenswrapper[4747]: E1128 13:19:39.920641 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.935606 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d14889e1684af4048389fdd266513ae71785f8562fa1320054a7c822fb4023\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:36Z\\\",\\\"message\\\":\\\":160\\\\nI1128 13:19:36.182494 6043 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.182952 6043 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.182823 6043 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183174 6043 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.183643 6043 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:36.183740 6043 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1128 13:19:36.183911 6043 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:36.184452 6043 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 
13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.943132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.943187 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.943231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.943259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.943278 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:39Z","lastTransitionTime":"2025-11-28T13:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.955400 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.976427 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.993805 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.993915 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.993987 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.994075 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5pd8\" (UniqueName: \"kubernetes.io/projected/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-kube-api-access-s5pd8\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:39 crc kubenswrapper[4747]: I1128 13:19:39.994888 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:39Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.014297 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.045908 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.045971 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.045991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.046019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.046038 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.058158 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z 
is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.095369 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.095450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.095506 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5pd8\" (UniqueName: \"kubernetes.io/projected/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-kube-api-access-s5pd8\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.095562 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.096386 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.096757 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.106785 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.123121 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.127425 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5pd8\" (UniqueName: \"kubernetes.io/projected/c35aeb74-fa2c-48de-b112-fb28a1b6d86c-kube-api-access-s5pd8\") pod \"ovnkube-control-plane-749d76644c-t5269\" (UID: \"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.137831 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.148113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.148161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.148173 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.148194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.148222 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.153241 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.164417 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.175785 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.185672 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.186321 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: W1128 13:19:40.202965 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35aeb74_fa2c_48de_b112_fb28a1b6d86c.slice/crio-0abbeb68e2f876a3858c123b6492172dbe045c10782c08386dd65e0d0c9c5f6c WatchSource:0}: Error finding container 0abbeb68e2f876a3858c123b6492172dbe045c10782c08386dd65e0d0c9c5f6c: Status 404 returned error can't find the container with id 0abbeb68e2f876a3858c123b6492172dbe045c10782c08386dd65e0d0c9c5f6c Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.203579 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.222158 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539
d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.236050 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.248142 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.253343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.253396 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.253413 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.253436 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.253448 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.261744 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.273664 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.285441 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.306032 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.344531 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.357124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 
13:19:40.357168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.357176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.357193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.357604 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.388101 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.427865 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.460350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.460396 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.460406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.460423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.460433 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.469913 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.511362 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod 
event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.544454 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.563158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.563196 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.563219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.563235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.563248 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.592521 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jpqkc"] Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.593246 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:40 crc kubenswrapper[4747]: E1128 13:19:40.593320 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.595579 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\"
,\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 
2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.627546 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.667534 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.667591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.667600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.667623 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.667633 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.669986 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.701678 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.701762 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk94t\" (UniqueName: \"kubernetes.io/projected/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-kube-api-access-hk94t\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.713754 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.754819 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.771326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.771379 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.771390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.771408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.771419 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.784634 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.803335 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk94t\" (UniqueName: \"kubernetes.io/projected/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-kube-api-access-hk94t\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.803456 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:40 crc kubenswrapper[4747]: E1128 13:19:40.803646 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:40 crc kubenswrapper[4747]: E1128 
13:19:40.803747 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs podName:8dd5d3d3-f6f3-48da-8e99-2e16fd81582f nodeName:}" failed. No retries permitted until 2025-11-28 13:19:41.303721708 +0000 UTC m=+33.966203478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs") pod "network-metrics-daemon-jpqkc" (UID: "8dd5d3d3-f6f3-48da-8e99-2e16fd81582f") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.829983 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.857896 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk94t\" (UniqueName: \"kubernetes.io/projected/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-kube-api-access-hk94t\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.873267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.873457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.873550 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.873691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.873806 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.888427 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.925665 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" event={"ID":"c35aeb74-fa2c-48de-b112-fb28a1b6d86c","Type":"ContainerStarted","Data":"07d33dce38654436f3efce02ae88a0c292cc3135b2ced18e3c5f82b13063a737"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.925735 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" event={"ID":"c35aeb74-fa2c-48de-b112-fb28a1b6d86c","Type":"ContainerStarted","Data":"2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.925755 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" event={"ID":"c35aeb74-fa2c-48de-b112-fb28a1b6d86c","Type":"ContainerStarted","Data":"0abbeb68e2f876a3858c123b6492172dbe045c10782c08386dd65e0d0c9c5f6c"} Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.932275 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.966850 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:40Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.978046 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.978126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.978149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.978178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:40 crc kubenswrapper[4747]: I1128 13:19:40.978196 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:40Z","lastTransitionTime":"2025-11-28T13:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.005236 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.053372 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.081997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.082074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.082093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.082122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.082142 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.089402 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.134202 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2
cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.182509 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod 
event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.187008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.187078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.187096 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.187123 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.187142 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.208683 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.247766 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc 
kubenswrapper[4747]: I1128 13:19:41.290353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.290432 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.290457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.290491 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.290517 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.294552 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.310860 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.311144 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.311310 4747 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs podName:8dd5d3d3-f6f3-48da-8e99-2e16fd81582f nodeName:}" failed. No retries permitted until 2025-11-28 13:19:42.311278362 +0000 UTC m=+34.973760122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs") pod "network-metrics-daemon-jpqkc" (UID: "8dd5d3d3-f6f3-48da-8e99-2e16fd81582f") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.329967 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.373368 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.394797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.394846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.394859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.394879 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.394890 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.411365 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.411460 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-28 13:19:57.411440364 +0000 UTC m=+50.073922104 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.413184 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192
.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.451789 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.485078 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.497782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.497830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.497843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.497866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.497881 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.512957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.513288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.513399 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513190 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: 
I1128 13:19:41.513506 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513627 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:57.513594688 +0000 UTC m=+50.176076608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513670 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513731 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513765 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 
13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513478 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513867 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:57.513832004 +0000 UTC m=+50.176313874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.513924 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:57.513903756 +0000 UTC m=+50.176385766 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.514133 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.514261 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.514353 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.514483 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:19:57.514465451 +0000 UTC m=+50.176947191 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.529487 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.570599 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.601928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.601993 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.602012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.602042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.602061 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.615355 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.641700 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.641779 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.641840 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.641700 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.641911 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:41 crc kubenswrapper[4747]: E1128 13:19:41.641936 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.650387 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.690439 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.706160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 
13:19:41.706242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.706260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.706280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.706295 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.724684 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.768878 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.808894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.808953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.808973 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.809000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.809029 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.818674 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod 
event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.848590 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.886022 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc 
kubenswrapper[4747]: I1128 13:19:41.912719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.912804 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.912827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.912858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.912878 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:41Z","lastTransitionTime":"2025-11-28T13:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.931701 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:41 crc kubenswrapper[4747]: I1128 13:19:41.969876 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:41Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.012631 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:42Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.015562 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.015626 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.015651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.015678 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.015696 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.055712 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:42Z 
is after 2025-08-24T17:21:41Z" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.100402 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11
-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:42Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.119817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.119891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.119912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.119939 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.119956 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.133381 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11
d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:42Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.222933 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.222978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.222991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.223011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.223024 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.322768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:42 crc kubenswrapper[4747]: E1128 13:19:42.323073 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:42 crc kubenswrapper[4747]: E1128 13:19:42.323244 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs podName:8dd5d3d3-f6f3-48da-8e99-2e16fd81582f nodeName:}" failed. No retries permitted until 2025-11-28 13:19:44.323186539 +0000 UTC m=+36.985668309 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs") pod "network-metrics-daemon-jpqkc" (UID: "8dd5d3d3-f6f3-48da-8e99-2e16fd81582f") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.326439 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.326489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.326506 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.326531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.326548 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.430403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.430474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.430494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.430522 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.430540 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.534462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.534521 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.534540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.534571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.534588 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.638502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.638587 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.638605 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.638632 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.638650 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.640424 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:42 crc kubenswrapper[4747]: E1128 13:19:42.640741 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.741935 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.742460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.742630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.742767 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.742915 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.846873 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.846966 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.846984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.847011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.847028 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.950808 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.950873 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.950898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.950930 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:42 crc kubenswrapper[4747]: I1128 13:19:42.950958 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:42Z","lastTransitionTime":"2025-11-28T13:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.053646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.053690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.053703 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.053720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.053733 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.156417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.156456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.156469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.156489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.156502 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.260081 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.260144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.260163 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.260189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.260244 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.363322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.363396 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.363421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.363455 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.363477 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.467056 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.467604 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.467782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.467940 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.468148 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.572171 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.572285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.572306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.572335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.572353 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.641558 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.641622 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:43 crc kubenswrapper[4747]: E1128 13:19:43.641836 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.641904 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:43 crc kubenswrapper[4747]: E1128 13:19:43.642098 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:43 crc kubenswrapper[4747]: E1128 13:19:43.642272 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.674607 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.674674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.674692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.674717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.674738 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.778731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.778790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.778807 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.778834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.778853 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.882867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.882978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.883002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.883041 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.883062 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.986159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.986329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.986433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.986496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:43 crc kubenswrapper[4747]: I1128 13:19:43.986517 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:43Z","lastTransitionTime":"2025-11-28T13:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.089149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.089248 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.089272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.089303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.089328 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.192828 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.192899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.192923 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.192997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.193037 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.297003 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.297099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.297125 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.297154 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.297171 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.350421 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:44 crc kubenswrapper[4747]: E1128 13:19:44.350736 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:44 crc kubenswrapper[4747]: E1128 13:19:44.350898 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs podName:8dd5d3d3-f6f3-48da-8e99-2e16fd81582f nodeName:}" failed. No retries permitted until 2025-11-28 13:19:48.35087246 +0000 UTC m=+41.013354200 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs") pod "network-metrics-daemon-jpqkc" (UID: "8dd5d3d3-f6f3-48da-8e99-2e16fd81582f") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.400514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.400586 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.400609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.400634 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.400651 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.510361 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.510798 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.510949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.511102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.511370 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.615179 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.615897 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.615937 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.615966 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.615979 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.640971 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:44 crc kubenswrapper[4747]: E1128 13:19:44.641124 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.719606 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.719686 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.719789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.719821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.719839 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.823751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.823822 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.823841 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.823871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.823894 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.927531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.927609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.927630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.927660 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:44 crc kubenswrapper[4747]: I1128 13:19:44.927680 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:44Z","lastTransitionTime":"2025-11-28T13:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.030593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.030657 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.030667 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.030689 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.030699 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.133765 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.133844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.133862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.133892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.133919 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.236747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.236793 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.236809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.236832 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.236846 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.339492 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.339552 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.339572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.339601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.339620 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.443306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.443357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.443368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.443384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.443396 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.547073 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.547142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.547160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.547199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.547242 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.641527 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.641599 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.641545 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:45 crc kubenswrapper[4747]: E1128 13:19:45.641823 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:45 crc kubenswrapper[4747]: E1128 13:19:45.641945 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:45 crc kubenswrapper[4747]: E1128 13:19:45.642126 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.651503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.651571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.651591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.651619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.651644 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.755367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.755447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.755472 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.755505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.755526 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.778967 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.780478 4747 scope.go:117] "RemoveContainer" containerID="c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59" Nov 28 13:19:45 crc kubenswrapper[4747]: E1128 13:19:45.780788 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.859138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.859203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.859252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.859462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.859484 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.963188 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.963289 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.963303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.963325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:45 crc kubenswrapper[4747]: I1128 13:19:45.963339 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:45Z","lastTransitionTime":"2025-11-28T13:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.065957 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.066016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.066033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.066055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.066070 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.169657 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.169740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.169767 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.169801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.169864 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.273907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.273974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.273993 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.274019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.274038 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.421864 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.421990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.422018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.422050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.422075 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.525503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.525560 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.525570 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.525593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.525604 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.628642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.628709 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.628727 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.628750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.628763 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.641075 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:46 crc kubenswrapper[4747]: E1128 13:19:46.641260 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.732249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.732314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.732327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.732351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.732366 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.834814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.834883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.834899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.834928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.834945 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.938365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.938446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.938470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.938502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:46 crc kubenswrapper[4747]: I1128 13:19:46.938533 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:46Z","lastTransitionTime":"2025-11-28T13:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.042004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.042067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.042088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.042118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.042140 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.145800 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.145879 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.145898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.145926 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.145946 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.249612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.249684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.249702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.249728 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.249747 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.353316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.353369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.353380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.353459 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.353477 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.456557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.456595 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.456604 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.456620 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.456632 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.559796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.559852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.559865 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.559886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.559899 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.641009 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:47 crc kubenswrapper[4747]: E1128 13:19:47.641255 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.641325 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:47 crc kubenswrapper[4747]: E1128 13:19:47.641531 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.641191 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:47 crc kubenswrapper[4747]: E1128 13:19:47.643111 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.662965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.663005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.663016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.663033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.663047 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.671480 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.704355 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod 
event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.725803 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.747403 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc 
kubenswrapper[4747]: I1128 13:19:47.765665 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45
ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.766942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.766991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.767004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.767022 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.767032 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.783536 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.797253 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.817237 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.849499 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.869413 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.870375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.870957 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.870979 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.871000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.871010 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.884102 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.896729 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf0
9\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.915070 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.930692 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.946767 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.962960 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539
d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.973747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.973924 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.974026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:47 crc 
kubenswrapper[4747]: I1128 13:19:47.974119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.974200 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:47Z","lastTransitionTime":"2025-11-28T13:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:47 crc kubenswrapper[4747]: I1128 13:19:47.983904 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:47Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.077484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.077533 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.077545 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.077564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.077588 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.180424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.180483 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.180502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.180528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.180546 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.283618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.283719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.283746 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.283781 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.283808 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.386443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.386510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.386528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.386558 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.386579 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.442714 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:48 crc kubenswrapper[4747]: E1128 13:19:48.443006 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:48 crc kubenswrapper[4747]: E1128 13:19:48.443129 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs podName:8dd5d3d3-f6f3-48da-8e99-2e16fd81582f nodeName:}" failed. No retries permitted until 2025-11-28 13:19:56.443096576 +0000 UTC m=+49.105578346 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs") pod "network-metrics-daemon-jpqkc" (UID: "8dd5d3d3-f6f3-48da-8e99-2e16fd81582f") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.490876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.490943 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.490967 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.490998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.491018 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.594488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.594555 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.594573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.594600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.594618 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.640815 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:48 crc kubenswrapper[4747]: E1128 13:19:48.641091 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.697303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.697357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.697368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.697389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.697403 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.800282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.800325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.800334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.800353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.800365 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.903591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.903661 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.903689 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.903729 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:48 crc kubenswrapper[4747]: I1128 13:19:48.903757 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:48Z","lastTransitionTime":"2025-11-28T13:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.007346 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.007422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.007440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.007469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.007487 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.110380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.110444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.110462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.110490 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.110510 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.214000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.214060 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.214076 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.214101 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.214120 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.317528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.317581 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.317591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.317612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.317624 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.419820 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.419887 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.419906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.419931 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.419956 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.522587 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.522980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.523189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.523443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.523577 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.627165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.627262 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.627281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.627310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.627328 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.640727 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.640762 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.640842 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:49 crc kubenswrapper[4747]: E1128 13:19:49.640996 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:49 crc kubenswrapper[4747]: E1128 13:19:49.641153 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:49 crc kubenswrapper[4747]: E1128 13:19:49.641261 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.731492 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.731865 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.732014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.732290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.732453 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.867866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.867958 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.867977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.868010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.868044 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.971006 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.971452 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.971790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.971920 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.972067 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.991749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.991988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.992139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.992297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:49 crc kubenswrapper[4747]: I1128 13:19:49.992389 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:49Z","lastTransitionTime":"2025-11-28T13:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: E1128 13:19:50.008582 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:50Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.013695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.013736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.013748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.013768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.013783 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: E1128 13:19:50.027075 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:50Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.035828 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.035879 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.035891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.035921 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.035936 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: E1128 13:19:50.049701 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:50Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.053437 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.053474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.053485 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.053507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.053522 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: E1128 13:19:50.065771 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:50Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.070414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.070489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.070513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.070542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.070565 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: E1128 13:19:50.085421 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:50Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:50 crc kubenswrapper[4747]: E1128 13:19:50.085543 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.087178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.087230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.087241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.087260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.087271 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.191481 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.191751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.191785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.191820 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.191844 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.295390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.295532 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.295575 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.295614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.295637 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.399098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.399192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.399267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.399295 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.399313 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.503063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.503113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.503122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.503152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.503163 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.606016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.606093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.606114 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.606149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.606173 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.640708 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:50 crc kubenswrapper[4747]: E1128 13:19:50.640921 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.710047 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.710110 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.710127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.710153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.710170 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.813732 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.813789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.813807 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.813835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.813853 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.916965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.917026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.917045 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.917070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:50 crc kubenswrapper[4747]: I1128 13:19:50.917088 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:50Z","lastTransitionTime":"2025-11-28T13:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.020696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.020747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.020761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.020780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.020792 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.124507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.124953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.125102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.125350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.125521 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.228842 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.228906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.228925 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.228950 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.228966 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.331813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.332116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.332371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.332408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.332422 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.436143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.436252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.436272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.436296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.436313 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.539900 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.539973 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.539997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.540034 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.540057 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.641068 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.641267 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.641281 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:51 crc kubenswrapper[4747]: E1128 13:19:51.641604 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:51 crc kubenswrapper[4747]: E1128 13:19:51.641758 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:51 crc kubenswrapper[4747]: E1128 13:19:51.641920 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.643655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.643713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.643731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.643757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.643776 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.747166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.747234 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.747258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.747282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.747297 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.851151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.851261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.851287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.851318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.851338 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.955543 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.955603 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.955615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.955639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:51 crc kubenswrapper[4747]: I1128 13:19:51.955653 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:51Z","lastTransitionTime":"2025-11-28T13:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.081910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.082246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.082320 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.082390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.082460 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.184829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.185127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.185318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.185403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.185503 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.288992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.289284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.289397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.289471 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.289538 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.393107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.393178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.393198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.393253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.393272 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.497541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.498043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.498183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.498407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.498572 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.602307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.602713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.602782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.602903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.602990 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.641323 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:52 crc kubenswrapper[4747]: E1128 13:19:52.641585 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.706032 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.706103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.706128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.706166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.706192 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.809328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.809376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.809387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.809405 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.809417 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.912186 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.912287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.912308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.912362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:52 crc kubenswrapper[4747]: I1128 13:19:52.912383 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:52Z","lastTransitionTime":"2025-11-28T13:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.015915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.016026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.016054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.016086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.016112 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.120671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.120754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.120782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.120816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.120835 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.224194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.224321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.224339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.224371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.224436 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.327568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.327637 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.327656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.327682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.327700 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.430616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.430679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.430698 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.430726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.430743 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.534613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.534704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.534735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.534773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.534798 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.639246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.639311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.639322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.639343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.639354 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.640604 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.640941 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:53 crc kubenswrapper[4747]: E1128 13:19:53.641024 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:53 crc kubenswrapper[4747]: E1128 13:19:53.641092 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.641150 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:53 crc kubenswrapper[4747]: E1128 13:19:53.641326 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.742196 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.742258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.742266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.742281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.742291 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.846103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.846178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.846202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.846298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.846343 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.949379 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.949434 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.949445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.949467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:53 crc kubenswrapper[4747]: I1128 13:19:53.949721 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:53Z","lastTransitionTime":"2025-11-28T13:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.052839 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.052905 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.052917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.052937 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.052950 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.156071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.156431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.156524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.156611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.156688 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.260833 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.260906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.260928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.260958 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.260978 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.365028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.365098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.365115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.365141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.365160 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.469053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.469137 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.469151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.469238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.469256 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.573239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.573286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.573298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.573319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.573331 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.640568 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:54 crc kubenswrapper[4747]: E1128 13:19:54.640746 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.676723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.677061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.677254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.677395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.677465 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.780512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.780562 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.780573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.780590 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.780600 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.884638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.884689 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.884699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.884721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.884733 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.986992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.987414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.987747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.987955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:54 crc kubenswrapper[4747]: I1128 13:19:54.988152 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:54Z","lastTransitionTime":"2025-11-28T13:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.092069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.093102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.093298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.093467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.093764 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.197020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.197091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.197112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.197142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.197165 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.300480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.300535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.300549 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.300571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.300586 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.404777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.404839 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.404852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.404873 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.404888 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.507616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.507707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.507732 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.507767 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.507796 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.610893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.610956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.610969 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.610990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.611001 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.641200 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.641267 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.641284 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:55 crc kubenswrapper[4747]: E1128 13:19:55.641390 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:55 crc kubenswrapper[4747]: E1128 13:19:55.641481 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:55 crc kubenswrapper[4747]: E1128 13:19:55.641733 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.714182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.714280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.714296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.714320 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.714334 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.818185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.818257 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.818268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.818287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.818300 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.921556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.921619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.921642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.921673 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:55 crc kubenswrapper[4747]: I1128 13:19:55.921695 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:55Z","lastTransitionTime":"2025-11-28T13:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.025272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.025335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.025359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.025391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.025414 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.128990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.129097 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.129117 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.129148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.129166 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.232471 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.232603 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.232623 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.232654 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.232671 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.336339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.336447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.336467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.336493 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.336511 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.440158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.440299 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.440334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.440367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.440390 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.447802 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:56 crc kubenswrapper[4747]: E1128 13:19:56.448010 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:56 crc kubenswrapper[4747]: E1128 13:19:56.448139 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs podName:8dd5d3d3-f6f3-48da-8e99-2e16fd81582f nodeName:}" failed. No retries permitted until 2025-11-28 13:20:12.44810188 +0000 UTC m=+65.110583650 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs") pod "network-metrics-daemon-jpqkc" (UID: "8dd5d3d3-f6f3-48da-8e99-2e16fd81582f") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.543835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.543897 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.543915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.543946 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.543965 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.640460 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:56 crc kubenswrapper[4747]: E1128 13:19:56.640691 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.650058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.650111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.650127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.650153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.650171 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.753829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.753927 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.753945 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.753973 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.753992 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.856999 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.857068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.857088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.857113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.857130 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.960028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.960150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.960178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.960248 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:56 crc kubenswrapper[4747]: I1128 13:19:56.960273 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:56Z","lastTransitionTime":"2025-11-28T13:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.063737 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.063919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.063949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.063984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.064007 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.167066 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.168327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.168544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.168753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.168949 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.271723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.271785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.271801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.271829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.271847 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.375708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.375760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.375777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.375804 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.375823 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.462901 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.463171 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:29.463122958 +0000 UTC m=+82.125604728 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.479157 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.479258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.479287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.479319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.479341 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.564655 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.564726 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.564768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.564805 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.564980 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565112 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565152 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565178 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565125 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.565096697 +0000 UTC m=+82.227578507 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565360 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:29.565322933 +0000 UTC m=+82.227804723 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565425 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565451 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565472 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565574 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.565550399 +0000 UTC m=+82.228032259 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565115 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.565876 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.565858617 +0000 UTC m=+82.228340357 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.582952 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.583037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.583065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.583100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.583127 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.640893 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.641730 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.642545 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.642601 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.642700 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.643910 4747 scope.go:117] "RemoveContainer" containerID="c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59" Nov 28 13:19:57 crc kubenswrapper[4747]: E1128 13:19:57.642833 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.664555 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.682051 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.686469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.686505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.686516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc 
kubenswrapper[4747]: I1128 13:19:57.686533 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.686545 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.700406 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 
13:19:57.717663 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.737885 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.757895 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.776773 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc 
kubenswrapper[4747]: I1128 13:19:57.790193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.790292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.790316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.790347 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.790371 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.798800 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.816073 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.830049 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.856137 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.889820 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.892900 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.892931 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.892943 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.892987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.893001 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.906677 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.932795 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.960668 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:
09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.979317 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.992851 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:57Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.995692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.995722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.995735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.995752 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.995763 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:57Z","lastTransitionTime":"2025-11-28T13:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:57 crc kubenswrapper[4747]: I1128 13:19:57.999953 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/1.log" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.002580 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.002973 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.021591 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46
879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.031766 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.033485 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.041396 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.046927 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.062681 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.076967 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.096461 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.098335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.098384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.098398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc 
kubenswrapper[4747]: I1128 13:19:58.098419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.098431 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.118517 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 
13:19:58.142161 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.156616 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.178484 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.190981 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.200525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.200599 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.200614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.200636 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.200673 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.201804 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc 
kubenswrapper[4747]: I1128 13:19:58.216709 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45
ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.228981 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.242473 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.264573 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.284569 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.300302 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\"
,\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 
2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.303040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.303160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.303183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.303225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.303244 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.316038 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.328824 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.345128 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.369636 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.380463 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.390885 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc 
kubenswrapper[4747]: I1128 13:19:58.406673 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.406723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.406734 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.406751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.406762 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.410756 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.427834 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.440941 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.463052 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.474826 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.490073 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.505189 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.508960 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.509018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.509035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.509059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.509077 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.523585 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.536066 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c86df88-b40a-42cb-98e4-71d02b1613bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.549293 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.562096 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:58Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.611817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 
13:19:58.611895 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.611916 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.611947 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.611966 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.640773 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:19:58 crc kubenswrapper[4747]: E1128 13:19:58.640986 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.715453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.715505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.715525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.715551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.715568 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.819296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.819649 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.819666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.819691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.819710 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.921942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.922008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.922027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.922055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:58 crc kubenswrapper[4747]: I1128 13:19:58.922072 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:58Z","lastTransitionTime":"2025-11-28T13:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.007796 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/2.log" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.008329 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/1.log" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.011352 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" exitCode=1 Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.012637 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:19:59 crc kubenswrapper[4747]: E1128 13:19:59.012795 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.012976 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.013019 4747 scope.go:117] "RemoveContainer" containerID="c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.024698 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.024744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.024755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.024773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.024785 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.043864 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.066619 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.085490 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.103744 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.117413 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.128932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.128982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.128994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.129015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.129031 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.130834 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.142360 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.154343 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3
135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.166918 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c86df88-b40a-42cb-98e4-71d02b1613bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.180833 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.194642 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.208602 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.222793 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc 
kubenswrapper[4747]: I1128 13:19:59.231992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.232058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.232075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.232102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.232122 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.240435 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.255949 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.266583 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.281082 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.299423 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d091df603554566acc12bce1ee4415cda97979072df843e5f120a887060a59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:38Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.271498 6169 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1128 13:19:38.271520 6169 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1128 13:19:38.271557 6169 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1128 13:19:38.271581 6169 handler.go:208] Removed *v1.Node event handler 2\\\\nI1128 13:19:38.271604 6169 handler.go:208] Removed *v1.Node event handler 7\\\\nI1128 13:19:38.271627 6169 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1128 13:19:38.271650 6169 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1128 13:19:38.272528 6169 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1128 13:19:38.272657 6169 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272751 6169 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.272775 6169 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1128 13:19:38.273074 6169 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"message\\\":\\\"es.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1128 13:19:58.558397 6404 services_controller.go:452] Built service 
openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558400 6404 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1128 13:19:58.558405 6404 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558199 6404 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1128 13:19:58.558411 6404 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1128 13:19:58.558056 6404 services_controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1128 13:19:58.558461 6404 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2
540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:19:59Z is after 2025-08-24T17:21:41Z" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.335381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.335422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.335434 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.335453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.335462 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.438726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.438789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.438809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.438834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.438852 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.542395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.542460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.542480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.542504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.542528 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.640838 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.640906 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.640874 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:19:59 crc kubenswrapper[4747]: E1128 13:19:59.641066 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:19:59 crc kubenswrapper[4747]: E1128 13:19:59.641249 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:19:59 crc kubenswrapper[4747]: E1128 13:19:59.641478 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.646199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.646290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.646324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.646348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.646365 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.752037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.752147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.752174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.752251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.752293 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.855518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.855595 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.855616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.855645 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.855663 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.959311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.959357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.959368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.959385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:19:59 crc kubenswrapper[4747]: I1128 13:19:59.959395 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:19:59Z","lastTransitionTime":"2025-11-28T13:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.018866 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/2.log" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.024106 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.024317 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.054713 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.062761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.062805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.062816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.062830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.062839 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.070299 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11
d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.085520 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.099394 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.115650 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.130645 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.146325 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.159582 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3
135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.165215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.165249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.165258 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.165274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.165285 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.170194 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c86df88-b40a-42cb-98e4-71d02b1613bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.181256 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.191501 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.209766 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"message\\\":\\\"es.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1128 13:19:58.558397 6404 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI1128 
13:19:58.558400 6404 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1128 13:19:58.558405 6404 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558199 6404 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1128 13:19:58.558411 6404 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1128 13:19:58.558056 6404 services_controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1128 13:19:58.558461 6404 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.221586 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.234356 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc 
kubenswrapper[4747]: I1128 13:20:00.251464 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45
ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.266547 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.269680 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.269743 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.269762 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.269786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.269803 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.281825 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.307560 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2
cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.345665 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.345731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.345755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.345787 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.345811 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.369639 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.375148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.375253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.375280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.375315 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.375340 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.391816 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.397457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.397517 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.397538 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.397556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.397567 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.413522 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.418501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.418559 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.418578 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.418607 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.418627 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.440699 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.448972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.449058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.449132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.449259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.449325 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.470627 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:00Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.470785 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.472700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.472763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.472789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.472821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.472845 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.575061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.575136 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.575160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.575191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.575258 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.641347 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:00 crc kubenswrapper[4747]: E1128 13:20:00.641531 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.677613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.677753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.677776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.677810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.677829 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.781092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.781159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.781185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.781247 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.781273 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.884792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.884868 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.884894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.884979 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.885008 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.987884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.987975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.988010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.988047 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:00 crc kubenswrapper[4747]: I1128 13:20:00.988064 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:00Z","lastTransitionTime":"2025-11-28T13:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.092023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.092068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.092087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.092113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.092129 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.195910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.195960 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.195976 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.196002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.196020 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.298604 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.298647 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.298660 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.298680 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.298693 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.401150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.401193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.401232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.401252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.401264 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.505037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.505121 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.505139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.505175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.505199 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.608398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.608531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.608553 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.608583 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.608605 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.641489 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.641587 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.641505 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 28 13:20:01 crc kubenswrapper[4747]: E1128 13:20:01.641744 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 28 13:20:01 crc kubenswrapper[4747]: E1128 13:20:01.642360 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 28 13:20:01 crc kubenswrapper[4747]: E1128 13:20:01.642088 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.711613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.711672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.711689 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.711716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.711734 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.814683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.814761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.814773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.814796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.814811 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.918448 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.918584 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.918639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.918669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:01 crc kubenswrapper[4747]: I1128 13:20:01.918698 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:01Z","lastTransitionTime":"2025-11-28T13:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.022577 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.022645 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.022662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.022690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.022707 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.126480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.126554 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.126580 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.126614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.126640 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.229486 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.229543 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.229559 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.229581 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.229598 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.332062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.332111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.332127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.332150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.332167 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.435124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.435187 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.435235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.435264 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.435283 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.540071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.540174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.540265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.540343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.540368 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.640597 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc"
Nov 28 13:20:02 crc kubenswrapper[4747]: E1128 13:20:02.640799 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.643048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.643101 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.643118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.643143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.643165 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.746801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.746859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.746879 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.746903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.746922 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.849005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.849054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.849067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.849085 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.849099 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.952108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.952148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.952156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.952171 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:02 crc kubenswrapper[4747]: I1128 13:20:02.952181 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:02Z","lastTransitionTime":"2025-11-28T13:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.054527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.054601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.054621 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.054650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.054670 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.157886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.157957 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.157979 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.158009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.158031 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.261457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.261513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.261528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.261551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.261566 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.364910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.365056 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.365083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.365120 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.365143 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.468615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.468662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.468674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.468692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.468707 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.573568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.573617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.573636 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.573664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.573684 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.640624 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 28 13:20:03 crc kubenswrapper[4747]: E1128 13:20:03.640821 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.640875 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 28 13:20:03 crc kubenswrapper[4747]: E1128 13:20:03.641038 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.641383 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 28 13:20:03 crc kubenswrapper[4747]: E1128 13:20:03.641629 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.676810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.676874 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.676886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.676905 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.676915 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.780742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.781165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.781334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.781465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.781625 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.884742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.884806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.884825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.884851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.884872 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.987726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.987806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.987830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.987860 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:03 crc kubenswrapper[4747]: I1128 13:20:03.987881 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:03Z","lastTransitionTime":"2025-11-28T13:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.090891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.090981 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.090998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.091027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.091047 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.194575 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.194647 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.194671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.194701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.194722 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.298027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.298086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.298103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.298132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.298149 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.401675 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.401863 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.401889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.401968 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.402053 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.506527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.506603 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.506622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.506693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.506717 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.610671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.610726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.610743 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.610812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.610836 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.640474 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:04 crc kubenswrapper[4747]: E1128 13:20:04.640653 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.714444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.714500 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.714518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.714546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.714562 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.817858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.817932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.817950 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.817978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.818001 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.921825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.921902 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.921928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.921957 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:04 crc kubenswrapper[4747]: I1128 13:20:04.921984 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:04Z","lastTransitionTime":"2025-11-28T13:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.025235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.025304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.025323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.025349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.025367 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.127907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.127981 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.128005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.128031 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.128049 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.231340 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.231400 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.231422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.231449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.231471 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.334473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.334568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.334587 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.334614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.334638 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.442758 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.443463 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.443483 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.443511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.443533 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.546375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.546788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.546958 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.547119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.547328 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.641010 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.641025 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:05 crc kubenswrapper[4747]: E1128 13:20:05.641749 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.641099 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:05 crc kubenswrapper[4747]: E1128 13:20:05.641949 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:05 crc kubenswrapper[4747]: E1128 13:20:05.641598 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.650416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.650466 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.650495 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.650516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.650533 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.754168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.754273 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.754298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.754333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.754377 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.857767 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.857826 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.857844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.857870 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.857890 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.961030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.961128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.961150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.961183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:05 crc kubenswrapper[4747]: I1128 13:20:05.961233 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:05Z","lastTransitionTime":"2025-11-28T13:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.064050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.064138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.064156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.064180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.064197 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.167713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.167806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.167823 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.167853 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.167873 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.271102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.271162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.271184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.271266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.271308 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.375295 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.375387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.375413 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.375446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.375470 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.478579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.478618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.478630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.478646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.478656 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.581997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.582055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.582072 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.582096 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.582112 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.641125 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:06 crc kubenswrapper[4747]: E1128 13:20:06.641398 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.685915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.686011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.686034 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.686104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.686130 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.789226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.789276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.789293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.789316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.789335 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.892354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.892401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.892413 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.892431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.892442 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.995069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.995108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.995119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.995136 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:06 crc kubenswrapper[4747]: I1128 13:20:06.995148 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:06Z","lastTransitionTime":"2025-11-28T13:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.098383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.098437 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.098449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.098472 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.098486 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.201337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.201381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.201392 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.201409 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.201420 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.303765 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.303814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.303832 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.303853 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.303867 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.406619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.406701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.406723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.406744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.406758 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.509167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.509243 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.509259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.509280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.509294 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.611525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.611594 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.611613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.611642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.611663 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.640968 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.641076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.640968 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:07 crc kubenswrapper[4747]: E1128 13:20:07.641198 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:07 crc kubenswrapper[4747]: E1128 13:20:07.641316 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:07 crc kubenswrapper[4747]: E1128 13:20:07.641460 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.663749 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c86df88-b40a-42cb-98e4-71d02b1613bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.683536 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.700626 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.714971 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.717970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.718046 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.718068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.718093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.718141 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.731717 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc 
kubenswrapper[4747]: I1128 13:20:07.756551 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45
ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.776938 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.822657 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.822742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.822759 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.822807 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.822822 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.827275 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.845738 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2
cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.867270 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"message\\\":\\\"es.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1128 13:19:58.558397 6404 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI1128 
13:19:58.558400 6404 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1128 13:19:58.558405 6404 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558199 6404 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1128 13:19:58.558411 6404 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1128 13:19:58.558056 6404 services_controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1128 13:19:58.558461 6404 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.900933 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.920427 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:
09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.925180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.925284 
4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.925307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.925368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.925388 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:07Z","lastTransitionTime":"2025-11-28T13:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.942656 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.963363 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.980349 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:07 crc kubenswrapper[4747]: I1128 13:20:07.996674 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:07Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.018119 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:08Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.029133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.029170 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.029180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.029198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.029224 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.034847 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:08Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.132313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.132390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.132403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.132425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.132444 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.235994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.236058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.236079 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.236171 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.236196 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.338761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.338797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.338806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.338823 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.338832 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.441152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.441239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.441253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.441275 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.441288 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.544748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.545399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.545423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.545456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.545478 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.640914 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:08 crc kubenswrapper[4747]: E1128 13:20:08.641379 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.649335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.649406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.649432 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.649466 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.649489 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.751802 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.751847 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.751858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.751878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.751890 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.855008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.855080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.855097 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.855125 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.855145 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.958391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.958482 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.958495 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.958516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:08 crc kubenswrapper[4747]: I1128 13:20:08.958529 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:08Z","lastTransitionTime":"2025-11-28T13:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.060830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.060897 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.060917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.060946 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.060961 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.163541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.163589 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.163603 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.163623 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.163635 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.266577 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.266622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.266635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.266652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.266664 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.370926 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.370991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.371015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.371047 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.371071 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.474838 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.474919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.474942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.474976 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.474998 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.578702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.578763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.578782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.578806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.578824 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.640873 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.640929 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.640873 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:09 crc kubenswrapper[4747]: E1128 13:20:09.641405 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:09 crc kubenswrapper[4747]: E1128 13:20:09.641538 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:09 crc kubenswrapper[4747]: E1128 13:20:09.641181 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.682267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.682318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.682336 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.682358 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.682376 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.787552 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.787628 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.787654 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.787686 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.787710 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.891299 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.891377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.891390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.891410 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.891423 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.995348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.995414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.995440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.995474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:09 crc kubenswrapper[4747]: I1128 13:20:09.995507 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:09Z","lastTransitionTime":"2025-11-28T13:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.099378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.099449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.099500 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.099528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.099547 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.202991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.203062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.203088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.203120 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.203170 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.307803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.307908 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.307933 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.308018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.308047 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.412385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.412470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.412498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.412538 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.412563 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.516056 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.516116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.516132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.516156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.516171 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.619100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.619131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.619140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.619154 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.619163 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.640660 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:10 crc kubenswrapper[4747]: E1128 13:20:10.640830 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.726342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.726425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.726456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.726491 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.726515 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.770883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.771015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.771037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.771059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.771112 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: E1128 13:20:10.794126 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:10Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.799420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.799507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.799529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.799551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.799566 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: E1128 13:20:10.830407 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:10Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.835108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.835150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.835164 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.835185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.835198 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: E1128 13:20:10.847795 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:10Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.852057 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.852111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.852122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.852144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.852157 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: E1128 13:20:10.867608 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:10Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.872199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.872276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.872296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.872321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.872339 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: E1128 13:20:10.889455 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:10Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:10 crc kubenswrapper[4747]: E1128 13:20:10.889624 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.891339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.891388 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.891401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.891423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.891436 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.994235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.994307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.994325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.994349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:10 crc kubenswrapper[4747]: I1128 13:20:10.994366 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:10Z","lastTransitionTime":"2025-11-28T13:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.096978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.097049 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.097066 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.097092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.097109 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.200366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.200411 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.200422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.200442 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.200456 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.303658 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.303779 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.303805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.303840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.303865 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.406568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.406613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.406624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.406642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.406653 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.509792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.509870 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.509894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.509932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.509957 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.612785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.612841 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.612854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.612876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.612888 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.641316 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.641442 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.641512 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:11 crc kubenswrapper[4747]: E1128 13:20:11.641720 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:11 crc kubenswrapper[4747]: E1128 13:20:11.641869 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:11 crc kubenswrapper[4747]: E1128 13:20:11.641984 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.715717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.715763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.715775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.715794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.715807 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.818679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.818737 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.818750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.818770 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.818783 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.921028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.921080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.921092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.921112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:11 crc kubenswrapper[4747]: I1128 13:20:11.921125 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:11Z","lastTransitionTime":"2025-11-28T13:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.023473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.023515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.023524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.023540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.023549 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.126801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.126846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.126857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.126876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.126887 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.230476 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.230567 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.230594 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.230629 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.230654 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.333434 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.333477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.333491 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.333509 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.333520 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.436494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.436613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.436639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.436672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.436700 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.540642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.540705 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.540727 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.540763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.540785 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.541785 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:12 crc kubenswrapper[4747]: E1128 13:20:12.542037 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:12 crc kubenswrapper[4747]: E1128 13:20:12.542135 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs podName:8dd5d3d3-f6f3-48da-8e99-2e16fd81582f nodeName:}" failed. No retries permitted until 2025-11-28 13:20:44.542103822 +0000 UTC m=+97.204585582 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs") pod "network-metrics-daemon-jpqkc" (UID: "8dd5d3d3-f6f3-48da-8e99-2e16fd81582f") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.640438 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:12 crc kubenswrapper[4747]: E1128 13:20:12.640698 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.644517 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.644600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.644623 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.644661 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.644684 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.748009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.748071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.748094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.748120 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.748142 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.852350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.852431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.852454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.852489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.852514 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.956030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.956088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.956109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.956135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:12 crc kubenswrapper[4747]: I1128 13:20:12.956153 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:12Z","lastTransitionTime":"2025-11-28T13:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.058899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.058963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.059013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.059040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.059057 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.161965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.162010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.162026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.162047 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.162061 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.264815 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.264850 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.264860 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.264875 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.264884 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.367776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.367839 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.367855 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.367879 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.367896 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.471516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.471580 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.471592 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.471612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.471625 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.574512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.574563 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.574573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.574590 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.574599 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.641349 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.641434 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.641545 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:13 crc kubenswrapper[4747]: E1128 13:20:13.641663 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:13 crc kubenswrapper[4747]: E1128 13:20:13.641875 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:13 crc kubenswrapper[4747]: E1128 13:20:13.641939 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.677013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.677089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.677116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.677149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.677171 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.779381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.779422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.779430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.779446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.779455 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.882291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.882357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.882377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.882404 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.882425 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.986694 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.986755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.986773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.986798 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:13 crc kubenswrapper[4747]: I1128 13:20:13.986815 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:13Z","lastTransitionTime":"2025-11-28T13:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.084036 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-78psz_11d91e3e-309b-4e83-9b0c-1f589c7670f6/kube-multus/0.log" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.084183 4747 generic.go:334] "Generic (PLEG): container finished" podID="11d91e3e-309b-4e83-9b0c-1f589c7670f6" containerID="65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df" exitCode=1 Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.084275 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78psz" event={"ID":"11d91e3e-309b-4e83-9b0c-1f589c7670f6","Type":"ContainerDied","Data":"65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.084917 4747 scope.go:117] "RemoveContainer" containerID="65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.088848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.088885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.088902 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.088925 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.088964 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.112132 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 
13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatus
es\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.140119 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.158277 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.176141 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.191475 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.191502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.191511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.191525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.191536 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.201023 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"message\\\":\\\"es.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1128 13:19:58.558397 6404 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI1128 
13:19:58.558400 6404 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1128 13:19:58.558405 6404 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558199 6404 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1128 13:19:58.558411 6404 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1128 13:19:58.558056 6404 services_controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1128 13:19:58.558461 6404 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.213588 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.228180 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc 
kubenswrapper[4747]: I1128 13:20:14.259150 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.278856 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.293729 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.293781 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.293792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.293808 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.293820 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.298879 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.319509 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"2025-11-28T13:19:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e4ca79d0-3c34-4470-b480-310bfa98e457\\\\n2025-11-28T13:19:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e4ca79d0-3c34-4470-b480-310bfa98e457 to /host/opt/cni/bin/\\\\n2025-11-28T13:19:28Z [verbose] multus-daemon started\\\\n2025-11-28T13:19:28Z [verbose] Readiness Indicator file check\\\\n2025-11-28T13:20:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.333720 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.351695 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.367128 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.377882 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3
135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.393644 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c86df88-b40a-42cb-98e4-71d02b1613bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.397846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.397884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.397895 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.397911 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.397922 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.407090 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.420409 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-28T13:20:14Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.499708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.499746 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.499775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.499790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.499799 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.602184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.602283 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.602306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.602337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.602371 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.640561 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:14 crc kubenswrapper[4747]: E1128 13:20:14.640734 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.641479 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:20:14 crc kubenswrapper[4747]: E1128 13:20:14.641717 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.705419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.705496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.705514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.705540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.705561 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.808375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.808418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.808429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.808450 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.808461 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.911244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.911317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.911341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.911371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4747]: I1128 13:20:14.911394 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.014031 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.014089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.014106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.014130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.014147 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.089533 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-78psz_11d91e3e-309b-4e83-9b0c-1f589c7670f6/kube-multus/0.log" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.089587 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78psz" event={"ID":"11d91e3e-309b-4e83-9b0c-1f589c7670f6","Type":"ContainerStarted","Data":"9da05d5b4fc961afd237c2c6db2f8bc212546df0bbf8cf3b241736b6459df234"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.107640 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.116169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.116240 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.116254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.116272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.116284 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.125164 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.142417 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.159727 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3
135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.171639 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c86df88-b40a-42cb-98e4-71d02b1613bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.181958 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.194068 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.205626 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc 
kubenswrapper[4747]: I1128 13:20:15.218662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.218691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.218703 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.218719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.218730 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.224025 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.239720 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.257492 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.279915 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.306609 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"message\\\":\\\"es.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1128 13:19:58.558397 6404 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558400 6404 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1128 13:19:58.558405 6404 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558199 6404 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1128 13:19:58.558411 6404 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1128 13:19:58.558056 6404 services_controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1128 13:19:58.558461 6404 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.318649 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.322703 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.322749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.322762 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.322782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.322795 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.345659 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.362990 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.377876 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.393030 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9da05d5b4fc961afd237c2c6db2f8bc212546df0bbf8cf3b241736b6459df234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"2025-11-28T13:19:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_e4ca79d0-3c34-4470-b480-310bfa98e457\\\\n2025-11-28T13:19:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e4ca79d0-3c34-4470-b480-310bfa98e457 to /host/opt/cni/bin/\\\\n2025-11-28T13:19:28Z [verbose] multus-daemon started\\\\n2025-11-28T13:19:28Z [verbose] Readiness Indicator file check\\\\n2025-11-28T13:20:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:15Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.425691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.425768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.425792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.425825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.425847 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.529431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.529486 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.529497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.529516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.529529 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.632130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.632194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.632225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.632251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.632264 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.640412 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.640461 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.640412 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:15 crc kubenswrapper[4747]: E1128 13:20:15.640552 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:15 crc kubenswrapper[4747]: E1128 13:20:15.640628 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:15 crc kubenswrapper[4747]: E1128 13:20:15.640693 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.735899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.735980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.736001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.736026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.736046 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.838412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.838507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.838522 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.838544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.838557 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.941362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.941421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.941439 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.941463 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:15 crc kubenswrapper[4747]: I1128 13:20:15.941481 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:15Z","lastTransitionTime":"2025-11-28T13:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.044840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.044891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.044900 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.044915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.044924 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.148462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.148511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.148523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.148542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.148553 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.252279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.252356 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.252374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.252429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.252449 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.355188 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.355242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.355253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.355268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.355278 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.458402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.458461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.458477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.458496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.458508 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.561350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.561438 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.561465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.561498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.561521 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.640719 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:16 crc kubenswrapper[4747]: E1128 13:20:16.640959 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.665083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.665143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.665161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.665195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.665252 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.769998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.770138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.770161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.770187 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.770284 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.873938 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.874039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.874064 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.874104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.874130 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.976975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.977042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.977063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.977097 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:16 crc kubenswrapper[4747]: I1128 13:20:16.977121 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:16Z","lastTransitionTime":"2025-11-28T13:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.079643 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.079713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.079731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.079763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.079788 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.184937 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.184997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.185015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.185048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.185068 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.287618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.287668 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.287679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.287697 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.287709 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.391399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.391477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.391496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.391523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.391543 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.494498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.494551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.494564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.494584 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.494597 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.598152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.598260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.598278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.598300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.598313 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.640767 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.640836 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.640784 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:17 crc kubenswrapper[4747]: E1128 13:20:17.641081 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:17 crc kubenswrapper[4747]: E1128 13:20:17.641155 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:17 crc kubenswrapper[4747]: E1128 13:20:17.640992 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.670406 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab7aa166-a678-402e-9300-6b1a315be4f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcdb90dba2fb8b4c337d6a8eee3a709dd1cb5698b25e9b4c0a28a720aa5333c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f25e1d359634277fd506dd61ad4b2fee1702859dfdee0a92f65681a5584468ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39ea8534ce439500c572d954ca3a45693e0ef967eefce1b59f9d0f3ab39416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9636e2dd8331a8d9d7a871299b6fecc1c5ec888e65a67d627823ad2e2451b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0220be6f17d038776578890ece6b64c19a9b3f0c69162b696eb47d52fecbe782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263a738710ef657f56aa031a2aa220dd104f272529445aff5b05ae5adbebe6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01b38175ae8b1cdc75282c4bdb1371d04c0027486be93929c1cb374b421a0f47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46879e439fea7807ef1b10357fd8178bb8831a0b34bfede3eda9ad1c1c8c8c5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-11-28T13:19:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.690460 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d391fd36-c358-4fee-a17d-be8d17fab092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bf69ba8550f390154f8a8a91427e51c9d0e112c1650bd267d8d16a04d0ca8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c842367916f22ccf6e3da613c9341932acbdd61e79fbc829f3f8a2a7d4c9827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68ce76903912c9581ec9b9aea95362afeee07a8051934b596cffbea89e9cdf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.701347 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.701388 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.701401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.701417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.701428 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.707344 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://445c4f530d62435b3063ae207ba2b42a7980e99e15ece66116e9cb49b072de8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.725334 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-78psz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d91e3e-309b-4e83-9b0c-1f589c7670f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9da05d5b4fc961afd237c2c6db2f8bc212546df0bbf8cf3b241736b6459df234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"2025-11-28T13:19:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e4ca79d0-3c34-4470-b480-310bfa98e457\\\\n2025-11-28T13:19:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e4ca79d0-3c34-4470-b480-310bfa98e457 to /host/opt/cni/bin/\\\\n2025-11-28T13:19:28Z [verbose] multus-daemon started\\\\n2025-11-28T13:19:28Z [verbose] Readiness Indicator file check\\\\n2025-11-28T13:20:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cn
i-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8kphz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-78psz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.741942 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.761464 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.776736 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.789525 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3
135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.803969 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.804959 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.805196 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.805433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.805641 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.805825 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c86df88-b40a-42cb-98e4-71d02b1613bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4713bbd6d720a3387fab4c276e9bed1f9535163fda472b255b663e5a3482e003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef8d737d6947a88ce0b5c9498806d792272660024b179adf3138b32385c4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcfc579e13e59d71a898ac62fdb96416592cc1340c9f0cabd53852293ef007f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3d6452835ee4883e25ada6b00163573287d3e8a100bffd89824a6f1b6a06a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.818994 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe6fa425bef314ee904882a3e49c9fb1edc2e28bdf2ae553f5aae3aef469d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.833716 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc55136c-24a8-4913-b8b9-afe93e54fd83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://613c4c2c6ac2e237032ca42a8d3486d3b36e7d6b741c4443e8ef9c261558df2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zbzpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.846882 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496bd000-e0cc-4af2-a1a6-04f392e21371\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:19:25Z\\\"
,\\\"message\\\":\\\"ormer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1128 13:19:25.523307 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523339 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1128 13:19:25.523447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1128 13:19:25.523451 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1128 13:19:25.523462 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1128 13:19:25.523469 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1128 13:19:25.523643 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-396585074/tls.crt::/tmp/serving-cert-396585074/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764335949\\\\\\\\\\\\\\\" (2025-11-28 13:19:09 +0000 UTC to 2025-12-28 13:19:10 +0000 UTC (now=2025-11-28 13:19:25.523566725 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524017 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764335965\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764335965\\\\\\\\\\\\\\\" (2025-11-28 12:19:25 +0000 UTC to 
2026-11-28 12:19:25 +0000 UTC (now=2025-11-28 13:19:25.523989226 +0000 UTC))\\\\\\\"\\\\nI1128 13:19:25.524044 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1128 13:19:25.524068 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF1128 13:19:25.523645 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.861270 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.871283 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9h2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533d335-7419-4f71-857b-2dbf2274a2cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e755270156c58bdc3583525106ae15460d012f25837085ab36a3891f5e660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pht9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9h2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.887602 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6sv29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d63baf-0ac0-4940-bd10-3ca1967456ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://127e68431d30aa8fe4e871e0253a13890eac73402ff21779bfb770106439bed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a870921eec50ef3a4de1bf61245d3814a3aec15a1367d71c3fbe0946762456f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e19320d2f4080fe8deb6016a7d0a979a75f510de2a0aaa944f1dad6fdd23ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d21a9b10044dd270cfabc46263560f6d52520911867a2e533c41cef38462f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dfe2cfb9cbf5fc942bafbb5eab85f8e321ec08c575edd6e3b4fa92e320ff170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1
bb3b053194a1dff5675c941c4f1c8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df018b1ae4b0129120f4bfe43d85c7d1bb3b053194a1dff5675c941c4f1c8235\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7618fce9adb6fad0ff6765756ead314a7592aa63f544d0382caeaf7e6fe7ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-28T13:19:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6gw4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6sv29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.908023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.908068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.908086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.908112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.908129 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:17Z","lastTransitionTime":"2025-11-28T13:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.918437 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a52417df-b828-4251-a786-afae5d1aa9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-28T13:19:58Z\\\",\\\"message\\\":\\\"es.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1128 13:19:58.558397 6404 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI1128 
13:19:58.558400 6404 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1128 13:19:58.558405 6404 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI1128 13:19:58.558199 6404 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1128 13:19:58.558411 6404 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1128 13:19:58.558056 6404 services_controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1128 13:19:58.558461 6404 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hb2vp_openshift-ovn-kubernetes(a52417df-b828-4251-a786-afae5d1aa9fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58320569b6adc845c6
b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rlqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hb2vp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.933983 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lh4kn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0fe5b7-fb51-4e74-b376-416fc0c9c9ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca77bdbfe068b935478459ce0761413d8dca42b6b497ff636824f8167320de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9h9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lh4kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:17 crc kubenswrapper[4747]: I1128 13:20:17.944685 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpqkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:17Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:18 crc 
kubenswrapper[4747]: I1128 13:20:18.010537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.010597 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.010616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.010642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.010660 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.112850 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.112939 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.112965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.113002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.113027 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.216746 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.216817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.216834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.216860 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.216876 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.319715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.319797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.319817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.319844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.319862 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.423131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.423581 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.423668 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.423760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.423855 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.527236 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.527650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.527851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.528063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.528304 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.630977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.631418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.631460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.631494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.631518 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.641593 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:18 crc kubenswrapper[4747]: E1128 13:20:18.641826 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.734619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.734669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.734687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.734708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.734723 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.837681 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.837753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.837772 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.837799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.837819 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.940397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.940479 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.940498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.941041 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:18 crc kubenswrapper[4747]: I1128 13:20:18.941108 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:18Z","lastTransitionTime":"2025-11-28T13:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.044945 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.045088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.045114 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.045152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.045176 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.147646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.147706 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.147722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.147746 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.147764 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.251152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.251277 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.251302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.251333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.251355 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.354094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.354172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.354251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.354291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.354313 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.458673 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.458722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.458733 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.458756 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.458770 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.562421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.562501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.562523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.562553 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.562572 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.641200 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.641336 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:19 crc kubenswrapper[4747]: E1128 13:20:19.641350 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.641198 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:19 crc kubenswrapper[4747]: E1128 13:20:19.641553 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:19 crc kubenswrapper[4747]: E1128 13:20:19.641773 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.665968 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.666009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.666021 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.666039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.666050 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.769161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.769536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.769612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.769722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.769746 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.872812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.872866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.872885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.872911 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.872928 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.977523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.977743 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.977807 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.977840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:19 crc kubenswrapper[4747]: I1128 13:20:19.977861 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:19Z","lastTransitionTime":"2025-11-28T13:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.080662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.080724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.080737 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.080758 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.080771 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.183178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.183251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.183262 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.183281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.183292 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.285531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.285582 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.285594 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.285613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.285624 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.388736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.388785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.388800 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.388819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.388835 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.492140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.492231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.492249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.492278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.492299 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.595415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.595478 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.595496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.595524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.595542 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.641165 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:20 crc kubenswrapper[4747]: E1128 13:20:20.641454 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.698897 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.698970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.698987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.699016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.699037 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.803194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.803291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.803309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.803336 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.803357 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.907116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.907194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.907258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.907294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:20 crc kubenswrapper[4747]: I1128 13:20:20.907319 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:20Z","lastTransitionTime":"2025-11-28T13:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.011104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.011173 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.011195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.011272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.011298 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.114110 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.114173 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.114191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.114239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.114257 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.161354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.161409 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.161426 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.161451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.161476 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.181779 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:21Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.193045 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.193130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.193151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.193184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.193250 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.217759 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:21Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.223713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.223775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.223796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.223825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.223846 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.245059 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:21Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.250465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.250509 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.250529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.250553 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.250572 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.270172 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:21Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.274358 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.274401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.274418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.274443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.274460 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.295990 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b96c0c46-5b5f-49cf-b534-641e4124214f\\\",\\\"systemUUID\\\":\\\"4871ee14-35cb-4f3f-af5a-3f1522596ec5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:21Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.296251 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.298994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.299082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.299100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.299129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.299145 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.403260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.403309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.403326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.403349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.403367 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.507384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.507443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.507461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.507487 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.507505 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.611459 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.611546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.611572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.611607 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.611630 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.640818 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.641085 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.641436 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.641554 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.641848 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:21 crc kubenswrapper[4747]: E1128 13:20:21.641951 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.714906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.714986 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.715010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.715040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.715062 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.818470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.818546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.818564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.818593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.818672 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.923189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.923286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.923309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.923337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:21 crc kubenswrapper[4747]: I1128 13:20:21.923355 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:21Z","lastTransitionTime":"2025-11-28T13:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.027456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.027539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.027564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.027595 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.027621 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.130740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.130796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.130813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.130840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.130859 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.234610 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.234682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.234699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.234726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.234746 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.338375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.338435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.338454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.338479 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.338503 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.442127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.442197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.442255 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.442285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.442307 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.545613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.545673 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.545696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.545723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.545742 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.641433 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:22 crc kubenswrapper[4747]: E1128 13:20:22.641643 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.648789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.648848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.648869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.648896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.648916 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.752555 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.752606 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.752627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.752653 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.752671 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.855108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.855175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.855186 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.855217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.855227 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.957655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.958063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.958182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.958341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:22 crc kubenswrapper[4747]: I1128 13:20:22.958455 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:22Z","lastTransitionTime":"2025-11-28T13:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.061801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.061871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.061896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.061927 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.061951 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.164676 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.164860 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.164884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.164911 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.164932 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.268492 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.268580 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.268614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.268645 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.268670 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.371559 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.371618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.371638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.371665 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.371681 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.474762 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.474796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.474806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.474831 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.474844 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.578020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.578118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.578134 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.578168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.578190 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.640650 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.640741 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:23 crc kubenswrapper[4747]: E1128 13:20:23.640941 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.640977 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:23 crc kubenswrapper[4747]: E1128 13:20:23.641544 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:23 crc kubenswrapper[4747]: E1128 13:20:23.641650 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.681447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.681492 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.681507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.681562 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.681576 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.784740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.784816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.784844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.784877 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.784903 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.888067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.889502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.889722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.889945 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.890164 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.994259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.994324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.994354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.994389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:23 crc kubenswrapper[4747]: I1128 13:20:23.994414 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:23Z","lastTransitionTime":"2025-11-28T13:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.097172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.097284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.097310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.097339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.097356 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.200919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.201001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.201024 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.201055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.201079 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.304435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.304504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.304523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.304550 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.304567 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.406531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.406835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.406913 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.406982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.407046 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.509652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.509988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.510130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.510290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.510410 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.613684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.613770 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.613798 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.613825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.613842 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.640545 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:24 crc kubenswrapper[4747]: E1128 13:20:24.640903 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.731519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.731605 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.731628 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.731663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.731688 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.834817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.834898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.834912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.834936 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.834948 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.939050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.939127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.939147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.939177 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:24 crc kubenswrapper[4747]: I1128 13:20:24.939196 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:24Z","lastTransitionTime":"2025-11-28T13:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.043142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.043264 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.043292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.043325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.043349 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.146355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.146427 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.146444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.146474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.146492 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.250042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.250096 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.250106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.250124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.250136 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.353795 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.353867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.353892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.353923 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.353943 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.458023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.458094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.458113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.458140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.458160 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.561806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.561863 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.561881 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.561906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.561924 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.641263 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.641341 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:25 crc kubenswrapper[4747]: E1128 13:20:25.641420 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.641267 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:25 crc kubenswrapper[4747]: E1128 13:20:25.641577 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:25 crc kubenswrapper[4747]: E1128 13:20:25.641656 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.664543 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.664619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.664634 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.664654 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.664668 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.768309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.768363 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.768389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.768415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.768433 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.872116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.872176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.872194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.872263 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.872301 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.974726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.974805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.974829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.974859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:25 crc kubenswrapper[4747]: I1128 13:20:25.974880 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:25Z","lastTransitionTime":"2025-11-28T13:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.078697 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.079149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.079167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.079197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.079286 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.183313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.183401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.183424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.183458 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.183484 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.286340 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.286412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.286431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.286456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.286472 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.389642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.389706 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.389724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.389755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.389776 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.492641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.492692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.492701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.492719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.492730 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.596461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.596541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.596561 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.596593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.596613 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.640660 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:26 crc kubenswrapper[4747]: E1128 13:20:26.640878 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.700679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.700755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.700776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.700803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.700824 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.804261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.804301 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.804314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.804334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.804349 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.908377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.908410 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.908419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.908435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:26 crc kubenswrapper[4747]: I1128 13:20:26.908443 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:26Z","lastTransitionTime":"2025-11-28T13:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.012407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.012504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.012530 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.012564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.012586 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.116244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.116288 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.116305 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.116343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.116360 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.219869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.219931 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.219949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.219975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.219992 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.322721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.322822 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.322841 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.322871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.322892 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.429113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.429240 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.429285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.429307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.429321 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.532526 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.532593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.532614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.532641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.532659 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.635321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.635777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.635982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.636181 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.636403 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.641360 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:27 crc kubenswrapper[4747]: E1128 13:20:27.641551 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.641892 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:27 crc kubenswrapper[4747]: E1128 13:20:27.642029 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.642275 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:27 crc kubenswrapper[4747]: E1128 13:20:27.642880 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.666346 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f7d79f57a36df40fe98a3c1081dca5723a6498ca859c98859989ed6baf57e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
8d928f5f5c2bf637c230b3d4cd1a501453aa729d1acbaf54b254781a970da7ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.686409 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c35aeb74-fa2c-48de-b112-fb28a1b6d86c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aa2e387dc4b69e508df9dde975374ed8aa2ea73529f5e47c9e88188cf4a1dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07d33dce38654436f3efce02ae88a0c292cc3
135b2ced18e3c5f82b13063a737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s5pd8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t5269\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.707386 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.729601 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:27Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.739247 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.739333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.739351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc 
kubenswrapper[4747]: I1128 13:20:27.739379 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.739426 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.803079 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podStartSLOduration=61.803048574 podStartE2EDuration="1m1.803048574s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.7811341 +0000 UTC m=+80.443615840" watchObservedRunningTime="2025-11-28 13:20:27.803048574 +0000 UTC m=+80.465530344" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.803335 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=29.803326222 podStartE2EDuration="29.803326222s" podCreationTimestamp="2025-11-28 13:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.802537011 +0000 UTC m=+80.465018841" watchObservedRunningTime="2025-11-28 13:20:27.803326222 +0000 UTC m=+80.465807992" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.822123 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t9h2n" 
podStartSLOduration=61.822099811 podStartE2EDuration="1m1.822099811s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.82207395 +0000 UTC m=+80.484555690" watchObservedRunningTime="2025-11-28 13:20:27.822099811 +0000 UTC m=+80.484581591" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.842178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.842278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.842309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.842345 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.842369 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.882426 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6sv29" podStartSLOduration=61.882399596 podStartE2EDuration="1m1.882399596s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.853489142 +0000 UTC m=+80.515970882" watchObservedRunningTime="2025-11-28 13:20:27.882399596 +0000 UTC m=+80.544881336" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.907527 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lh4kn" podStartSLOduration=61.907492116 podStartE2EDuration="1m1.907492116s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.895517771 +0000 UTC m=+80.557999531" watchObservedRunningTime="2025-11-28 13:20:27.907492116 +0000 UTC m=+80.569973886" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.925335 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=62.925311239 podStartE2EDuration="1m2.925311239s" podCreationTimestamp="2025-11-28 13:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.924653741 +0000 UTC m=+80.587135491" watchObservedRunningTime="2025-11-28 13:20:27.925311239 +0000 UTC m=+80.587792989" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.945033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.945481 
4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.945617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.945749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.945862 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:27Z","lastTransitionTime":"2025-11-28T13:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.998903 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=62.998879844 podStartE2EDuration="1m2.998879844s" podCreationTimestamp="2025-11-28 13:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.997183488 +0000 UTC m=+80.659665228" watchObservedRunningTime="2025-11-28 13:20:27.998879844 +0000 UTC m=+80.661361574" Nov 28 13:20:27 crc kubenswrapper[4747]: I1128 13:20:27.999077 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-78psz" podStartSLOduration=61.999070899 podStartE2EDuration="1m1.999070899s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:27.968701886 +0000 UTC m=+80.631183626" 
watchObservedRunningTime="2025-11-28 13:20:27.999070899 +0000 UTC m=+80.661552629" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.012119 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=56.012094972 podStartE2EDuration="56.012094972s" podCreationTimestamp="2025-11-28 13:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:28.011802354 +0000 UTC m=+80.674284094" watchObservedRunningTime="2025-11-28 13:20:28.012094972 +0000 UTC m=+80.674576702" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.048030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.048083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.048093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.048110 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.048120 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.151122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.151256 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.151333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.151422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.151452 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.254508 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.254591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.254611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.254638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.254656 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.358488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.358568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.358592 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.358624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.358662 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.462072 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.462156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.462182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.462246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.462274 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.565498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.565556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.565576 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.565601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.565620 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.640598 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:28 crc kubenswrapper[4747]: E1128 13:20:28.640962 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.658429 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.669009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.669058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.669078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.669101 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.669118 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.773101 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.773160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.773177 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.773203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.773249 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.877412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.878520 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.878571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.878598 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.878611 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.989519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.990349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.990387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.990416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:28 crc kubenswrapper[4747]: I1128 13:20:28.990434 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:28Z","lastTransitionTime":"2025-11-28T13:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.092712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.093561 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.093758 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.093903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.094046 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.196631 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.196696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.196716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.196741 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.196760 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.300287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.300372 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.300397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.300427 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.300450 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.403650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.403752 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.403775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.403803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.403820 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.506601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.507007 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.507147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.507313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.507454 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.548376 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.548662 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-28 13:21:33.548604421 +0000 UTC m=+146.211086211 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.610660 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.610712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.610730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.610754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.610772 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.641317 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.641505 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.641516 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.641636 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.641700 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.642324 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.642899 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.650494 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.650585 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.650651 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.650708 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.650763 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.650877 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.650902 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:21:33.650864134 +0000 UTC m=+146.313346064 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.650974 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651033 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651035 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:21:33.651001987 +0000 UTC m=+146.313483907 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651056 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651066 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651120 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651143 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651144 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2025-11-28 13:21:33.651120121 +0000 UTC m=+146.313602021 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:29 crc kubenswrapper[4747]: E1128 13:20:29.651306 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:21:33.651280565 +0000 UTC m=+146.313762355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.713702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.713760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.713779 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.713806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.713822 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.816613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.816676 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.816695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.816723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.816742 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.920430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.920907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.920919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.920937 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:29 crc kubenswrapper[4747]: I1128 13:20:29.920950 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:29Z","lastTransitionTime":"2025-11-28T13:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.023689 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.023742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.023753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.023772 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.023786 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.130934 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.130984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.130998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.131018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.131030 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.146410 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/2.log" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.149584 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerStarted","Data":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.150128 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.165165 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t5269" podStartSLOduration=64.165142617 podStartE2EDuration="1m4.165142617s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:30.164120499 +0000 UTC m=+82.826602230" watchObservedRunningTime="2025-11-28 13:20:30.165142617 +0000 UTC m=+82.827624347" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.234354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.234422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.234438 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.234462 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.234481 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.269972 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.269950299 podStartE2EDuration="2.269950299s" podCreationTimestamp="2025-11-28 13:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:30.266457714 +0000 UTC m=+82.928939444" watchObservedRunningTime="2025-11-28 13:20:30.269950299 +0000 UTC m=+82.932432049" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.316924 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podStartSLOduration=64.316907111 podStartE2EDuration="1m4.316907111s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:30.315925194 +0000 UTC m=+82.978406924" watchObservedRunningTime="2025-11-28 13:20:30.316907111 +0000 UTC m=+82.979388841" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.338378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.338426 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.338436 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.338454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.338466 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.441017 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.441051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.441059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.441074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.441083 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.543997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.544044 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.544053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.544071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.544086 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.590415 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpqkc"] Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.590564 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:30 crc kubenswrapper[4747]: E1128 13:20:30.590652 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.646061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.646108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.646119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.646136 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.646146 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.748431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.748476 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.748486 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.748503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.748515 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.850477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.850514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.850523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.850538 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.850546 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.953074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.953116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.953129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.953147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:30 crc kubenswrapper[4747]: I1128 13:20:30.953159 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:30Z","lastTransitionTime":"2025-11-28T13:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.055716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.055783 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.055806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.055830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.055846 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.158657 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.158692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.158700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.158714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.158722 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.262523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.262584 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.262600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.262631 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.262655 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.366619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.366701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.366727 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.366765 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.366791 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.470725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.471402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.471430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.471464 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.471491 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.573188 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.573250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.573261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.573276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.573286 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.640611 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.640695 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:31 crc kubenswrapper[4747]: E1128 13:20:31.640753 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpqkc" podUID="8dd5d3d3-f6f3-48da-8e99-2e16fd81582f" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.640799 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:31 crc kubenswrapper[4747]: E1128 13:20:31.640857 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:31 crc kubenswrapper[4747]: E1128 13:20:31.640957 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.641033 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:31 crc kubenswrapper[4747]: E1128 13:20:31.641110 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.675970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.676025 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.676034 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.676051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.676061 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.677029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.677073 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.677085 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.677105 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.677122 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:31Z","lastTransitionTime":"2025-11-28T13:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.761689 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"] Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.762537 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" Nov 28 13:20:31 crc kubenswrapper[4747]: W1128 13:20:31.764594 4747 reflector.go:561] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": failed to list *v1.Secret: secrets "default-dockercfg-gxtc4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Nov 28 13:20:31 crc kubenswrapper[4747]: E1128 13:20:31.764677 4747 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"default-dockercfg-gxtc4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-gxtc4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.764957 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.765732 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.765765 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.872910 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" 
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.872974 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.872998 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.873021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.873061 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.974571 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.974631 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.974655 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.974680 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.974719 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.974888 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.974923 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.976977 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:31 crc kubenswrapper[4747]: I1128 13:20:31.983586 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:32 crc kubenswrapper[4747]: I1128 13:20:32.003138 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jm2h8\" (UID: \"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.087596 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" secret="" err="failed to sync secret cache: timed out waiting for the condition"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.087706 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8"
Nov 28 13:20:33 crc kubenswrapper[4747]: W1128 13:20:33.112077 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf79e5c0e_cf95_4dfb_a3e8_74dc9c7116ad.slice/crio-fc5f3c36fdf3e03e70747897be4a3cdfc09e68619a14bb8554261dcfb256366b WatchSource:0}: Error finding container fc5f3c36fdf3e03e70747897be4a3cdfc09e68619a14bb8554261dcfb256366b: Status 404 returned error can't find the container with id fc5f3c36fdf3e03e70747897be4a3cdfc09e68619a14bb8554261dcfb256366b
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.125864 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.126022 4747 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.166912 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" event={"ID":"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad","Type":"ContainerStarted","Data":"fc5f3c36fdf3e03e70747897be4a3cdfc09e68619a14bb8554261dcfb256366b"}
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.188265 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.189143 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.189703 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jdt8x"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.190529 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.191231 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jms9b"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.191668 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.192986 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.193455 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.194987 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.195332 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nq9nc"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.196000 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nq9nc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.201500 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.201802 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.201884 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.201510 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.202294 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.206734 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.208186 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-49gpd"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.208756 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.208999 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.209610 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.209837 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-49gpd"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.211178 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sf6rs"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.212009 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.212655 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.212872 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.213058 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.213270 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.213409 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.213519 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.213883 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.214055 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.214242 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.214296 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.214525 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.214639 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.214830 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.214960 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.215056 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.215197 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.215372 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.215502 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.215599 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.231366 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.232441 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.232506 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.234616 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.235094 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-66kvx"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.236491 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.236478 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.240174 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-66kvx"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.247041 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-cjg4p"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.248068 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.254467 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.254554 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.254916 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.254953 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.255154 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.255301 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.255635 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.255736 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.255908 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.255959 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.255990 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.256035 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.256629 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.257075 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.257764 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.258481 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.258973 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.263381 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.264485 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.265028 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.265912 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-z6ppk"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.266405 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-c6vdt"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.266699 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-c6vdt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.266953 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.269748 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.270034 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.270109 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.270304 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.270410 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273172 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273387 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273464 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273525 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273662 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273750 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273756 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273846 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273904 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.273944 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274022 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274066 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274180 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274250 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274341 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274415 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274492 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.274576 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.275630 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.275749 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.275889 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.275987 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.276059 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.276196 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.276306 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.276386 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.276461 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.277592 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.277813 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.277943 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.280473 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.280994 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.283469 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.283853 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.283974 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.284197 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.284561 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.284730 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.284869 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.284979 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285086 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285241 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285359 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285496 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285612 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285734 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285844 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.285843 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.298390 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300485 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-certificates\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300563 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0c345d3-2efb-458e-9b68-52c46be2279c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300600 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-tls\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-trusted-ca\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300653 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-bound-sa-token\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300723 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0c345d3-2efb-458e-9b68-52c46be2279c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.300787 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnpx\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-kube-api-access-qwnpx\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.301634 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.801611344 +0000 UTC m=+86.464093074 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.306007 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.311587 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.311872 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.312100 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.312674 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.317226 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lzcth"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.317743 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.336020 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.336450 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.336690 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dr67x"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.336933 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.337172 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"]
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.337432 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.337603 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.337882 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.338037 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.338090 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.338260 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.340506 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.340696 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.340799 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.342513 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.342923 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.343160 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.344893 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.345439 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-926zp"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.346013 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.346037 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.347953 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-59qwc"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.348890 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.350023 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.350607 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.350845 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.351442 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.351942 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.352487 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.353130 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.353161 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.353859 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.354574 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.354997 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.355902 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-stzdc"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.356310 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.357112 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.357550 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.358610 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.359899 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cgmxb"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.360674 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.361168 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.362543 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jdt8x"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.363712 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fsbcb"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.364484 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.365516 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sf6rs"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.366946 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jms9b"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.368488 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nq9nc"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.370061 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.370610 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-c6vdt"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.372309 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.372852 4747 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.374331 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.375315 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cjg4p"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.377295 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-926zp"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.378558 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-66kvx"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.380446 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-49gpd"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.382016 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.383518 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.387718 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.390544 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.392184 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.396619 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.398640 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401433 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.401650 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.901626646 +0000 UTC m=+86.564108376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401679 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40766993-1700-48b0-97ab-775d3076167f-etcd-client\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401708 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssrh\" (UniqueName: \"kubernetes.io/projected/f0a48ada-f9d6-49a1-a109-11b05e4b757c-kube-api-access-bssrh\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401728 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3917efc-ed36-4d2a-9af3-690506b9e302-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4d7v\" (UniqueName: 
\"kubernetes.io/projected/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-kube-api-access-c4d7v\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401760 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-service-ca-bundle\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401775 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7204cb20-aa3b-4484-bc0e-64155cb7f734-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6bf44\" (UID: \"7204cb20-aa3b-4484-bc0e-64155cb7f734\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401792 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-encryption-config\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401810 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69dm\" (UniqueName: \"kubernetes.io/projected/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-kube-api-access-z69dm\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: 
\"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401832 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-tls\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401849 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2z4\" (UniqueName: \"kubernetes.io/projected/a9581480-5267-4031-b97c-cf5e5546448e-kube-api-access-lm2z4\") pod \"dns-operator-744455d44c-66kvx\" (UID: \"a9581480-5267-4031-b97c-cf5e5546448e\") " pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401873 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-bound-sa-token\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401889 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-config\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401906 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b16cb6fb-6e9e-440b-96df-50841a2e14d3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401922 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9581480-5267-4031-b97c-cf5e5546448e-metrics-tls\") pod \"dns-operator-744455d44c-66kvx\" (UID: \"a9581480-5267-4031-b97c-cf5e5546448e\") " pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401948 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dw4lg\" (UID: \"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401963 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a7ae477-ca56-4362-a334-a2915d71fdf0-images\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401978 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") 
" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.401993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd319a7-061b-48a6-8061-fa2f200e749f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402014 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnpx\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-kube-api-access-qwnpx\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402033 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-config\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402049 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfj6\" (UniqueName: \"kubernetes.io/projected/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-kube-api-access-4pfj6\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402064 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/22f19033-2027-4208-ae92-8955ad8d744f-proxy-tls\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402079 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-etcd-serving-ca\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402097 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-etcd-ca\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402112 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22f19033-2027-4208-ae92-8955ad8d744f-images\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402127 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-config\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" Nov 28 13:20:33 crc 
kubenswrapper[4747]: I1128 13:20:33.402143 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-oauth-config\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402164 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3917efc-ed36-4d2a-9af3-690506b9e302-proxy-tls\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402179 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlxzs\" (UniqueName: \"kubernetes.io/projected/e3917efc-ed36-4d2a-9af3-690506b9e302-kube-api-access-qlxzs\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402197 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-config\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402232 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-certificates\") pod 
\"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404332 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66p7w\" (UniqueName: \"kubernetes.io/projected/1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed-kube-api-access-66p7w\") pod \"control-plane-machine-set-operator-78cbb6b69f-dw4lg\" (UID: \"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-serving-cert\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404392 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b16cb6fb-6e9e-440b-96df-50841a2e14d3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.402251 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dr67x"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404579 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a7ae477-ca56-4362-a334-a2915d71fdf0-config\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404594 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404629 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0c345d3-2efb-458e-9b68-52c46be2279c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404701 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-signing-key\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404786 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404826 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-image-import-ca\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404843 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-serving-cert\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404860 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-service-ca\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404879 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-trusted-ca\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404894 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-etcd-service-ca\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404908 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-etcd-client\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404923 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404945 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-serving-cert\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.404974 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6xqb\" (UniqueName: \"kubernetes.io/projected/8a7ae477-ca56-4362-a334-a2915d71fdf0-kube-api-access-f6xqb\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.405004 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br575\" (UniqueName: \"kubernetes.io/projected/7204cb20-aa3b-4484-bc0e-64155cb7f734-kube-api-access-br575\") pod \"package-server-manager-789f6589d5-6bf44\" (UID: 
\"7204cb20-aa3b-4484-bc0e-64155cb7f734\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.405024 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-audit\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.405010 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lzcth"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.405093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-trusted-ca\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.405104 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-certificates\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.405124 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40766993-1700-48b0-97ab-775d3076167f-serving-cert\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 
13:20:33.405438 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbwc\" (UniqueName: \"kubernetes.io/projected/22f19033-2027-4208-ae92-8955ad8d744f-kube-api-access-trbwc\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.405894 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-oauth-serving-cert\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406325 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406352 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406459 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-trusted-ca\") pod \"image-registry-697d97f7c8-jms9b\" (UID: 
\"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406479 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0c345d3-2efb-458e-9b68-52c46be2279c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406478 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406504 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b921d77a-e870-46f5-afe8-e071962b3881-audit-dir\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406559 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406594 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a7ae477-ca56-4362-a334-a2915d71fdf0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.406988 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0c345d3-2efb-458e-9b68-52c46be2279c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407246 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsc67\" (UniqueName: \"kubernetes.io/projected/b921d77a-e870-46f5-afe8-e071962b3881-kube-api-access-fsc67\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407286 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-trusted-ca-bundle\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407690 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rjr\" (UniqueName: \"kubernetes.io/projected/b16cb6fb-6e9e-440b-96df-50841a2e14d3-kube-api-access-x6rjr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407754 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p9kqt\" (UniqueName: \"kubernetes.io/projected/40766993-1700-48b0-97ab-775d3076167f-kube-api-access-p9kqt\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407783 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd319a7-061b-48a6-8061-fa2f200e749f-config\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407804 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd319a7-061b-48a6-8061-fa2f200e749f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407821 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-config\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.408184 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.908169253 +0000 UTC m=+86.570650983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.407853 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-serving-cert\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.410297 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-signing-cabundle\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.410329 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22f19033-2027-4208-ae92-8955ad8d744f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.410353 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/b921d77a-e870-46f5-afe8-e071962b3881-node-pullsecrets\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.411893 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.412916 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-tls\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.413031 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.414994 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.418049 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.419511 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nvrln"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.420178 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nvrln" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.421154 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.422306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0c345d3-2efb-458e-9b68-52c46be2279c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.422750 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.424115 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-59qwc"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.426192 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.427256 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cgmxb"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.428375 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.429462 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.430536 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24"] Nov 28 
13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.431589 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.433167 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.434197 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-stzdc"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.435199 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nvrln"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.436966 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wqrlt"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.438032 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.438033 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wqrlt"] Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.451920 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.472553 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.493187 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.510864 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.511012 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.010989121 +0000 UTC m=+86.673470851 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-config\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511136 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/46d3747c-fa48-416e-84b8-c0d7ad4394f2-profile-collector-cert\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511161 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf48580-f8cb-46a8-8d47-f1317a13eca4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511187 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-serving-cert\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511256 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/46d3747c-fa48-416e-84b8-c0d7ad4394f2-srv-cert\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511279 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rk7j\" (UniqueName: \"kubernetes.io/projected/de3d9a69-800a-4146-9ed2-b74e44da14ed-kube-api-access-5rk7j\") pod \"ingress-canary-nvrln\" (UID: \"de3d9a69-800a-4146-9ed2-b74e44da14ed\") " pod="openshift-ingress-canary/ingress-canary-nvrln" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-signing-key\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511428 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl48w\" (UniqueName: \"kubernetes.io/projected/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-kube-api-access-wl48w\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 
13:20:33.511446 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511462 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmbz\" (UniqueName: \"kubernetes.io/projected/1ff99da0-58a5-4e05-8e55-88f24bb0a962-kube-api-access-bbmbz\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511512 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-apiservice-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511531 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-registration-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-image-import-ca\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511589 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-serving-cert\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-service-ca\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511632 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-etcd-service-ca\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511678 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-etcd-client\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511716 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj4hc\" (UniqueName: \"kubernetes.io/projected/d1eb69af-0de8-4778-af20-7a12273e384d-kube-api-access-gj4hc\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511766 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-client-ca\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511794 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td54k\" (UniqueName: \"kubernetes.io/projected/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-kube-api-access-td54k\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511842 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511871 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-socket-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511918 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trbwc\" (UniqueName: \"kubernetes.io/projected/22f19033-2027-4208-ae92-8955ad8d744f-kube-api-access-trbwc\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-oauth-serving-cert\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511944 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-config\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.511957 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/750cc1d2-faa4-46ca-86a1-57720f4d922a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512009 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91bb382c-c2d4-4118-aee5-1df968d79d25-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512024 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/4bd443a2-dd2a-4278-854c-0d7c403b2603-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ksgtz\" (UID: \"4bd443a2-dd2a-4278-854c-0d7c403b2603\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512096 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512118 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-trusted-ca-bundle\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512163 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gdlt\" (UniqueName: \"kubernetes.io/projected/fe3d7e88-f7a4-428b-9415-1dd1b4047015-kube-api-access-6gdlt\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512188 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhp2f\" (UniqueName: \"kubernetes.io/projected/6f46540d-f949-4ebd-aa09-0336f09ddfef-kube-api-access-nhp2f\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512248 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9kqt\" (UniqueName: \"kubernetes.io/projected/40766993-1700-48b0-97ab-775d3076167f-kube-api-access-p9kqt\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512276 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd319a7-061b-48a6-8061-fa2f200e749f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9kpr\" (UniqueName: \"kubernetes.io/projected/4bd443a2-dd2a-4278-854c-0d7c403b2603-kube-api-access-z9kpr\") pod \"cluster-samples-operator-665b6dd947-ksgtz\" (UID: \"4bd443a2-dd2a-4278-854c-0d7c403b2603\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512351 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-signing-cabundle\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512367 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b921d77a-e870-46f5-afe8-e071962b3881-node-pullsecrets\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512410 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff99da0-58a5-4e05-8e55-88f24bb0a962-service-ca-bundle\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512434 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512454 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf48580-f8cb-46a8-8d47-f1317a13eca4-config\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 
crc kubenswrapper[4747]: I1128 13:20:33.512477 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512500 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-encryption-config\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512525 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmf9l\" (UniqueName: \"kubernetes.io/projected/533808f8-dc48-431c-bfe9-6019090f4832-kube-api-access-cmf9l\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512545 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3f4ad9e-7db1-42fa-94ed-232c8b908911-auth-proxy-config\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512565 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/750cc1d2-faa4-46ca-86a1-57720f4d922a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512590 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4d7v\" (UniqueName: \"kubernetes.io/projected/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-kube-api-access-c4d7v\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512614 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-service-ca-bundle\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512636 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9185d1-abcc-43fc-a1af-42834e838dae-serving-cert\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512660 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-policies\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512680 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512698 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-mountpoint-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512713 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdcdf\" (UniqueName: \"kubernetes.io/projected/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-kube-api-access-vdcdf\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512731 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512751 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-client-ca\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2z4\" (UniqueName: \"kubernetes.io/projected/a9581480-5267-4031-b97c-cf5e5546448e-kube-api-access-lm2z4\") pod \"dns-operator-744455d44c-66kvx\" (UID: \"a9581480-5267-4031-b97c-cf5e5546448e\") " pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512833 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-config\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512854 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22026c31-b3d5-4041-8835-4a0f97f456e6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lzcth\" (UID: \"22026c31-b3d5-4041-8835-4a0f97f456e6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 
13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512876 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-srv-cert\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512898 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjvm\" (UniqueName: \"kubernetes.io/projected/46d3747c-fa48-416e-84b8-c0d7ad4394f2-kube-api-access-jqjvm\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512920 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-config-volume\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512942 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/750cc1d2-faa4-46ca-86a1-57720f4d922a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a9581480-5267-4031-b97c-cf5e5546448e-metrics-tls\") pod \"dns-operator-744455d44c-66kvx\" (UID: \"a9581480-5267-4031-b97c-cf5e5546448e\") " pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.512994 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bf48580-f8cb-46a8-8d47-f1317a13eca4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513013 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-certs\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513037 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a7ae477-ca56-4362-a334-a2915d71fdf0-images\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-secret-volume\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 
13:20:33.513068 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22f19033-2027-4208-ae92-8955ad8d744f-proxy-tls\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513083 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-etcd-serving-ca\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513097 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513112 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513128 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfj6\" (UniqueName: \"kubernetes.io/projected/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-kube-api-access-4pfj6\") pod \"service-ca-9c57cc56f-49gpd\" (UID: 
\"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513143 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-etcd-ca\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513161 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8np\" (UniqueName: \"kubernetes.io/projected/22026c31-b3d5-4041-8835-4a0f97f456e6-kube-api-access-db8np\") pod \"multus-admission-controller-857f4d67dd-lzcth\" (UID: \"22026c31-b3d5-4041-8835-4a0f97f456e6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513183 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e9185d1-abcc-43fc-a1af-42834e838dae-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513221 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-serving-cert\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513244 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-oauth-config\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513267 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3917efc-ed36-4d2a-9af3-690506b9e302-proxy-tls\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlxzs\" (UniqueName: \"kubernetes.io/projected/e3917efc-ed36-4d2a-9af3-690506b9e302-kube-api-access-qlxzs\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513310 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa18d98-d246-4a68-99bb-98a50a2b1e87-serving-cert\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513329 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-etcd-client\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 
crc kubenswrapper[4747]: I1128 13:20:33.513349 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513374 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b16cb6fb-6e9e-440b-96df-50841a2e14d3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513399 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66p7w\" (UniqueName: \"kubernetes.io/projected/1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed-kube-api-access-66p7w\") pod \"control-plane-machine-set-operator-78cbb6b69f-dw4lg\" (UID: \"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513421 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa18d98-d246-4a68-99bb-98a50a2b1e87-config\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513452 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7ae477-ca56-4362-a334-a2915d71fdf0-config\") 
pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bb382c-c2d4-4118-aee5-1df968d79d25-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513491 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513506 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-audit-policies\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513519 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de3d9a69-800a-4146-9ed2-b74e44da14ed-cert\") pod \"ingress-canary-nvrln\" (UID: \"de3d9a69-800a-4146-9ed2-b74e44da14ed\") " pod="openshift-ingress-canary/ingress-canary-nvrln" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513535 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-trusted-ca\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513552 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8rm6\" (UniqueName: \"kubernetes.io/projected/3e9185d1-abcc-43fc-a1af-42834e838dae-kube-api-access-q8rm6\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513573 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d89e4b8f-9b72-467e-bc54-c4e6421717ac-serving-cert\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513599 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513619 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-serving-cert\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513641 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-serving-cert\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513662 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6xqb\" (UniqueName: \"kubernetes.io/projected/8a7ae477-ca56-4362-a334-a2915d71fdf0-kube-api-access-f6xqb\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br575\" (UniqueName: \"kubernetes.io/projected/7204cb20-aa3b-4484-bc0e-64155cb7f734-kube-api-access-br575\") pod \"package-server-manager-789f6589d5-6bf44\" (UID: \"7204cb20-aa3b-4484-bc0e-64155cb7f734\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-audit\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513726 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-config\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513748 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3d7e88-f7a4-428b-9415-1dd1b4047015-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513769 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-node-bootstrap-token\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513792 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40766993-1700-48b0-97ab-775d3076167f-serving-cert\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513813 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513837 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9prxl\" (UniqueName: \"kubernetes.io/projected/750cc1d2-faa4-46ca-86a1-57720f4d922a-kube-api-access-9prxl\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513857 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b921d77a-e870-46f5-afe8-e071962b3881-audit-dir\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513873 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-tmpfs\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513888 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513903 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-csi-data-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513920 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a7ae477-ca56-4362-a334-a2915d71fdf0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513936 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsc67\" (UniqueName: \"kubernetes.io/projected/b921d77a-e870-46f5-afe8-e071962b3881-kube-api-access-fsc67\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513952 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnqkz\" (UniqueName: \"kubernetes.io/projected/d89e4b8f-9b72-467e-bc54-c4e6421717ac-kube-api-access-bnqkz\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513970 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rjr\" (UniqueName: \"kubernetes.io/projected/b16cb6fb-6e9e-440b-96df-50841a2e14d3-kube-api-access-x6rjr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.513989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2fl\" (UniqueName: \"kubernetes.io/projected/91bb382c-c2d4-4118-aee5-1df968d79d25-kube-api-access-5j2fl\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514014 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514029 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-webhook-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514045 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-metrics-certs\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514059 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-default-certificate\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514074 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/533808f8-dc48-431c-bfe9-6019090f4832-config-volume\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514089 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533808f8-dc48-431c-bfe9-6019090f4832-metrics-tls\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514107 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd319a7-061b-48a6-8061-fa2f200e749f-config\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514122 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-config\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514138 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-serving-cert\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514154 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514171 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22f19033-2027-4208-ae92-8955ad8d744f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514187 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz6qg\" (UniqueName: \"kubernetes.io/projected/d3f4ad9e-7db1-42fa-94ed-232c8b908911-kube-api-access-pz6qg\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514230 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-config\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514247 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40766993-1700-48b0-97ab-775d3076167f-etcd-client\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514263 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2zkl\" (UniqueName: \"kubernetes.io/projected/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-kube-api-access-g2zkl\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514279 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsfsj\" (UniqueName: \"kubernetes.io/projected/6fa18d98-d246-4a68-99bb-98a50a2b1e87-kube-api-access-gsfsj\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514296 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514316 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3917efc-ed36-4d2a-9af3-690506b9e302-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514332 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plq7\" (UniqueName: \"kubernetes.io/projected/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-kube-api-access-8plq7\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514349 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssrh\" (UniqueName: \"kubernetes.io/projected/f0a48ada-f9d6-49a1-a109-11b05e4b757c-kube-api-access-bssrh\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514368 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7204cb20-aa3b-4484-bc0e-64155cb7f734-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6bf44\" (UID: \"7204cb20-aa3b-4484-bc0e-64155cb7f734\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514384 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-metrics-tls\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514401 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514417 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkb9d\" (UniqueName: \"kubernetes.io/projected/df275db8-c500-4448-8073-97e037ad189f-kube-api-access-kkb9d\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514432 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514448 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-encryption-config\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514465 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69dm\" (UniqueName: \"kubernetes.io/projected/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-kube-api-access-z69dm\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514483 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d3f4ad9e-7db1-42fa-94ed-232c8b908911-machine-approver-tls\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514516 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54kk\" (UniqueName: \"kubernetes.io/projected/e6b781bc-66b7-427e-8175-bd578310559e-kube-api-access-j54kk\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514532 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-plugins-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514547 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514562 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhrlb\" (UniqueName: \"kubernetes.io/projected/871d0c50-a5cc-4de2-9566-0dc749cb24f2-kube-api-access-lhrlb\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514590 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-trusted-ca\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514609 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16cb6fb-6e9e-440b-96df-50841a2e14d3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514626 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xnjf\" (UniqueName: \"kubernetes.io/projected/1a702977-9ffe-4c8a-b5d8-c49bff5b7030-kube-api-access-8xnjf\") pod \"migrator-59844c95c7-gxp24\" (UID: \"1a702977-9ffe-4c8a-b5d8-c49bff5b7030\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514682 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe3d7e88-f7a4-428b-9415-1dd1b4047015-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514700 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dw4lg\" (UID: \"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514739 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-dir\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd319a7-061b-48a6-8061-fa2f200e749f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514920 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qk7\" (UniqueName: \"kubernetes.io/projected/5a895309-39e8-42bb-8df7-821c0c600504-kube-api-access-79qk7\") pod \"downloads-7954f5f757-c6vdt\" (UID: \"5a895309-39e8-42bb-8df7-821c0c600504\") " pod="openshift-console/downloads-7954f5f757-c6vdt"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.514944 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-config\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515064 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6b781bc-66b7-427e-8175-bd578310559e-audit-dir\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515086 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f4ad9e-7db1-42fa-94ed-232c8b908911-config\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-config\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515236 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22f19033-2027-4208-ae92-8955ad8d744f-images\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515259 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-stats-auth\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515275 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515702 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-signing-key\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.515700 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-serving-cert\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.517105 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-signing-cabundle\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.517303 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-oauth-serving-cert\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.517463 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b921d77a-e870-46f5-afe8-e071962b3881-node-pullsecrets\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.517728 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b921d77a-e870-46f5-afe8-e071962b3881-audit-dir\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.518237 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22f19033-2027-4208-ae92-8955ad8d744f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"
Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.518269 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.018257128 +0000 UTC m=+86.680738848 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.518863 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-service-ca\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.518919 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-etcd-serving-ca\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.519313 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b16cb6fb-6e9e-440b-96df-50841a2e14d3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.519396 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-audit\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.519516 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-trusted-ca-bundle\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.519658 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-etcd-ca\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.519729 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-image-import-ca\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.519808 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-service-ca-bundle\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.519914 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-etcd-service-ca\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.520472 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.521748 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a7ae477-ca56-4362-a334-a2915d71fdf0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.522084 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.522576 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-config\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.523267 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/edd319a7-061b-48a6-8061-fa2f200e749f-config\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.523619 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a7ae477-ca56-4362-a334-a2915d71fdf0-images\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.523810 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40766993-1700-48b0-97ab-775d3076167f-etcd-client\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.525773 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7204cb20-aa3b-4484-bc0e-64155cb7f734-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6bf44\" (UID: \"7204cb20-aa3b-4484-bc0e-64155cb7f734\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.526091 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40766993-1700-48b0-97ab-775d3076167f-config\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.526323 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-serving-cert\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.526532 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd319a7-061b-48a6-8061-fa2f200e749f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.526698 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22f19033-2027-4208-ae92-8955ad8d744f-proxy-tls\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.526762 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16cb6fb-6e9e-440b-96df-50841a2e14d3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.527309 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22f19033-2027-4208-ae92-8955ad8d744f-images\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" Nov 28 13:20:33 crc 
kubenswrapper[4747]: I1128 13:20:33.527391 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dw4lg\" (UID: \"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.527485 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3917efc-ed36-4d2a-9af3-690506b9e302-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.527760 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-encryption-config\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.527812 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b921d77a-e870-46f5-afe8-e071962b3881-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.527975 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-trusted-ca\") pod \"console-operator-58897d9998-nq9nc\" (UID: 
\"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.528181 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-config\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.528457 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-oauth-config\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.528552 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-serving-cert\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.528911 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.528969 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40766993-1700-48b0-97ab-775d3076167f-serving-cert\") pod 
\"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.529579 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-serving-cert\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.530137 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0a48ada-f9d6-49a1-a109-11b05e4b757c-console-config\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.530881 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7ae477-ca56-4362-a334-a2915d71fdf0-config\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.531824 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.534523 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9581480-5267-4031-b97c-cf5e5546448e-metrics-tls\") pod \"dns-operator-744455d44c-66kvx\" (UID: \"a9581480-5267-4031-b97c-cf5e5546448e\") " pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.534540 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3917efc-ed36-4d2a-9af3-690506b9e302-proxy-tls\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.536362 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b921d77a-e870-46f5-afe8-e071962b3881-etcd-client\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.551527 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.572062 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.591544 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.611644 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616671 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616762 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-tmpfs\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616800 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9prxl\" (UniqueName: \"kubernetes.io/projected/750cc1d2-faa4-46ca-86a1-57720f4d922a-kube-api-access-9prxl\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnqkz\" (UniqueName: \"kubernetes.io/projected/d89e4b8f-9b72-467e-bc54-c4e6421717ac-kube-api-access-bnqkz\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616841 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616858 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-csi-data-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " 
pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.616890 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.116867192 +0000 UTC m=+86.779348932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616937 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-csi-data-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616947 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2fl\" (UniqueName: \"kubernetes.io/projected/91bb382c-c2d4-4118-aee5-1df968d79d25-kube-api-access-5j2fl\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.616988 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-metrics-certs\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617011 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617036 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-webhook-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617060 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-default-certificate\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617082 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/533808f8-dc48-431c-bfe9-6019090f4832-config-volume\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/533808f8-dc48-431c-bfe9-6019090f4832-metrics-tls\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617138 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz6qg\" (UniqueName: \"kubernetes.io/projected/d3f4ad9e-7db1-42fa-94ed-232c8b908911-kube-api-access-pz6qg\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617159 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617179 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-config\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617220 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2zkl\" (UniqueName: \"kubernetes.io/projected/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-kube-api-access-g2zkl\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617245 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsfsj\" (UniqueName: \"kubernetes.io/projected/6fa18d98-d246-4a68-99bb-98a50a2b1e87-kube-api-access-gsfsj\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617266 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617271 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-tmpfs\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plq7\" (UniqueName: \"kubernetes.io/projected/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-kube-api-access-8plq7\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617329 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-metrics-tls\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617360 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617381 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkb9d\" (UniqueName: \"kubernetes.io/projected/df275db8-c500-4448-8073-97e037ad189f-kube-api-access-kkb9d\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617402 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617425 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d3f4ad9e-7db1-42fa-94ed-232c8b908911-machine-approver-tls\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617447 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54kk\" (UniqueName: 
\"kubernetes.io/projected/e6b781bc-66b7-427e-8175-bd578310559e-kube-api-access-j54kk\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617468 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-plugins-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617494 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-trusted-ca\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617517 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617537 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhrlb\" (UniqueName: \"kubernetes.io/projected/871d0c50-a5cc-4de2-9566-0dc749cb24f2-kube-api-access-lhrlb\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617561 4747 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8xnjf\" (UniqueName: \"kubernetes.io/projected/1a702977-9ffe-4c8a-b5d8-c49bff5b7030-kube-api-access-8xnjf\") pod \"migrator-59844c95c7-gxp24\" (UID: \"1a702977-9ffe-4c8a-b5d8-c49bff5b7030\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617583 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe3d7e88-f7a4-428b-9415-1dd1b4047015-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qk7\" (UniqueName: \"kubernetes.io/projected/5a895309-39e8-42bb-8df7-821c0c600504-kube-api-access-79qk7\") pod \"downloads-7954f5f757-c6vdt\" (UID: \"5a895309-39e8-42bb-8df7-821c0c600504\") " pod="openshift-console/downloads-7954f5f757-c6vdt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617630 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-dir\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617651 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc 
kubenswrapper[4747]: I1128 13:20:33.617684 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f4ad9e-7db1-42fa-94ed-232c8b908911-config\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617703 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6b781bc-66b7-427e-8175-bd578310559e-audit-dir\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617727 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-stats-auth\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617749 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617777 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/46d3747c-fa48-416e-84b8-c0d7ad4394f2-profile-collector-cert\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617804 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf48580-f8cb-46a8-8d47-f1317a13eca4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617831 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/46d3747c-fa48-416e-84b8-c0d7ad4394f2-srv-cert\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617851 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rk7j\" (UniqueName: \"kubernetes.io/projected/de3d9a69-800a-4146-9ed2-b74e44da14ed-kube-api-access-5rk7j\") pod \"ingress-canary-nvrln\" (UID: \"de3d9a69-800a-4146-9ed2-b74e44da14ed\") " pod="openshift-ingress-canary/ingress-canary-nvrln" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617873 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl48w\" (UniqueName: \"kubernetes.io/projected/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-kube-api-access-wl48w\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617898 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmbz\" (UniqueName: 
\"kubernetes.io/projected/1ff99da0-58a5-4e05-8e55-88f24bb0a962-kube-api-access-bbmbz\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617920 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-apiservice-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617940 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-registration-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617964 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-client-ca\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.617985 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td54k\" (UniqueName: \"kubernetes.io/projected/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-kube-api-access-td54k\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618006 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj4hc\" (UniqueName: \"kubernetes.io/projected/d1eb69af-0de8-4778-af20-7a12273e384d-kube-api-access-gj4hc\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618027 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618048 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-socket-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618092 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/750cc1d2-faa4-46ca-86a1-57720f4d922a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618117 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: 
\"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618150 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91bb382c-c2d4-4118-aee5-1df968d79d25-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618179 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4bd443a2-dd2a-4278-854c-0d7c403b2603-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ksgtz\" (UID: \"4bd443a2-dd2a-4278-854c-0d7c403b2603\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618232 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gdlt\" (UniqueName: \"kubernetes.io/projected/fe3d7e88-f7a4-428b-9415-1dd1b4047015-kube-api-access-6gdlt\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618273 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhp2f\" (UniqueName: \"kubernetes.io/projected/6f46540d-f949-4ebd-aa09-0336f09ddfef-kube-api-access-nhp2f\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618306 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9kpr\" (UniqueName: \"kubernetes.io/projected/4bd443a2-dd2a-4278-854c-0d7c403b2603-kube-api-access-z9kpr\") pod \"cluster-samples-operator-665b6dd947-ksgtz\" (UID: \"4bd443a2-dd2a-4278-854c-0d7c403b2603\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618334 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff99da0-58a5-4e05-8e55-88f24bb0a962-service-ca-bundle\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618357 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618377 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf48580-f8cb-46a8-8d47-f1317a13eca4-config\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618400 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmf9l\" (UniqueName: \"kubernetes.io/projected/533808f8-dc48-431c-bfe9-6019090f4832-kube-api-access-cmf9l\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " 
pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618421 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618441 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-encryption-config\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618460 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3f4ad9e-7db1-42fa-94ed-232c8b908911-auth-proxy-config\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618484 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/750cc1d2-faa4-46ca-86a1-57720f4d922a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618516 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3e9185d1-abcc-43fc-a1af-42834e838dae-serving-cert\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618535 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-policies\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618559 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618579 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-mountpoint-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618604 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 
13:20:33.618627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdcdf\" (UniqueName: \"kubernetes.io/projected/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-kube-api-access-vdcdf\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618666 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-client-ca\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618689 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618725 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22026c31-b3d5-4041-8835-4a0f97f456e6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lzcth\" (UID: \"22026c31-b3d5-4041-8835-4a0f97f456e6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618745 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-srv-cert\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618770 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjvm\" (UniqueName: \"kubernetes.io/projected/46d3747c-fa48-416e-84b8-c0d7ad4394f2-kube-api-access-jqjvm\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618792 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-config-volume\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/750cc1d2-faa4-46ca-86a1-57720f4d922a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618931 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-secret-volume\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618959 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5bf48580-f8cb-46a8-8d47-f1317a13eca4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.618978 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-certs\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619011 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619090 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db8np\" (UniqueName: \"kubernetes.io/projected/22026c31-b3d5-4041-8835-4a0f97f456e6-kube-api-access-db8np\") pod \"multus-admission-controller-857f4d67dd-lzcth\" (UID: \"22026c31-b3d5-4041-8835-4a0f97f456e6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 13:20:33 crc 
kubenswrapper[4747]: I1128 13:20:33.619112 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e9185d1-abcc-43fc-a1af-42834e838dae-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619132 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-serving-cert\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619161 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa18d98-d246-4a68-99bb-98a50a2b1e87-serving-cert\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619182 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-etcd-client\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619229 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619271 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa18d98-d246-4a68-99bb-98a50a2b1e87-config\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619303 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bb382c-c2d4-4118-aee5-1df968d79d25-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619326 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8rm6\" (UniqueName: \"kubernetes.io/projected/3e9185d1-abcc-43fc-a1af-42834e838dae-kube-api-access-q8rm6\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619347 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d89e4b8f-9b72-467e-bc54-c4e6421717ac-serving-cert\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619368 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-audit-policies\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619392 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de3d9a69-800a-4146-9ed2-b74e44da14ed-cert\") pod \"ingress-canary-nvrln\" (UID: \"de3d9a69-800a-4146-9ed2-b74e44da14ed\") " pod="openshift-ingress-canary/ingress-canary-nvrln" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619413 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-serving-cert\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-config\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619471 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3d7e88-f7a4-428b-9415-1dd1b4047015-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.619490 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-node-bootstrap-token\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.620181 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-config\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.620490 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-metrics-certs\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.620999 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-default-certificate\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.621352 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-registration-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.621827 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-plugins-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.622091 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-client-ca\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.622141 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-dir\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.622280 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6b781bc-66b7-427e-8175-bd578310559e-audit-dir\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.622354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-socket-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.622719 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.12269969 +0000 UTC m=+86.785181430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.623085 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/871d0c50-a5cc-4de2-9566-0dc749cb24f2-mountpoint-dir\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.623366 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.623823 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf48580-f8cb-46a8-8d47-f1317a13eca4-config\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 
13:20:33.624535 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff99da0-58a5-4e05-8e55-88f24bb0a962-service-ca-bundle\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.624542 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-secret-volume\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.624798 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-config-volume\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.625117 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/750cc1d2-faa4-46ca-86a1-57720f4d922a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.624687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e9185d1-abcc-43fc-a1af-42834e838dae-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.627448 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf48580-f8cb-46a8-8d47-f1317a13eca4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.627502 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.627697 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ff99da0-58a5-4e05-8e55-88f24bb0a962-stats-auth\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.627846 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d89e4b8f-9b72-467e-bc54-c4e6421717ac-serving-cert\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.629271 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/750cc1d2-faa4-46ca-86a1-57720f4d922a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.629599 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/46d3747c-fa48-416e-84b8-c0d7ad4394f2-profile-collector-cert\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.630291 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22026c31-b3d5-4041-8835-4a0f97f456e6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lzcth\" (UID: \"22026c31-b3d5-4041-8835-4a0f97f456e6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.631443 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9185d1-abcc-43fc-a1af-42834e838dae-serving-cert\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.631638 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.640747 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.640768 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.641240 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.641444 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.651660 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.662763 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-metrics-tls\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.671716 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.692111 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.711482 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.720826 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.721719 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.221689664 +0000 UTC m=+86.884171394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.731675 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.757896 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.763512 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.778440 4747 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.784181 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-trusted-ca\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.791630 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.811631 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.823028 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.823433 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.323415902 +0000 UTC m=+86.985897632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.852315 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.869692 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/46d3747c-fa48-416e-84b8-c0d7ad4394f2-srv-cert\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.872701 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.892618 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.911709 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.915407 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d3f4ad9e-7db1-42fa-94ed-232c8b908911-machine-approver-tls\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") 
" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.924851 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.925012 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.424984024 +0000 UTC m=+87.087465754 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.926089 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:33 crc kubenswrapper[4747]: E1128 13:20:33.926793 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.426771873 +0000 UTC m=+87.089253613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.931663 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.935290 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3f4ad9e-7db1-42fa-94ed-232c8b908911-auth-proxy-config\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.952666 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.962935 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f4ad9e-7db1-42fa-94ed-232c8b908911-config\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" Nov 28 13:20:33 crc kubenswrapper[4747]: I1128 13:20:33.971559 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 28 13:20:33 
crc kubenswrapper[4747]: I1128 13:20:33.995730 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.012554 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.027611 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.028607 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.528581043 +0000 UTC m=+87.191062773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.031068 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.039579 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-serving-cert\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.063305 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.069997 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.072910 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.077452 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-config\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.091188 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.112515 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.129751 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.132410 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.632380808 +0000 UTC m=+87.294862548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.133307 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.135652 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-client-ca\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.152030 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.172044 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.172766 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" event={"ID":"f79e5c0e-cf95-4dfb-a3e8-74dc9c7116ad","Type":"ContainerStarted","Data":"795c5265fb36f6516acee8cbcabfb8464372303bc76ccefe2e3e45d20cf8b5b8"} Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.176547 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/91bb382c-c2d4-4118-aee5-1df968d79d25-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.193687 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.212382 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.216467 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bb382c-c2d4-4118-aee5-1df968d79d25-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.231239 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.231364 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.731344621 +0000 UTC m=+87.393826341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.231590 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.231984 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.731973668 +0000 UTC m=+87.394455398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.232665 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.259165 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.270581 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.271193 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.289603 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.291829 4747 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.299073 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.311919 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.316720 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.331847 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.334614 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.334821 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.834795255 +0000 UTC m=+87.497277025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.335504 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.335951 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.835931376 +0000 UTC m=+87.498413106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.350720 4747 request.go:700] Waited for 1.001176934s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/secrets?fieldSelector=metadata.name%3Dv4-0-config-system-router-certs&limit=500&resourceVersion=0 Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.353160 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.366523 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.387243 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.397699 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.400598 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.407482 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.413150 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.419518 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.432467 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.436934 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: 
E1128 13:20:34.437078 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.937050698 +0000 UTC m=+87.599532458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.437313 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.437852 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.937834629 +0000 UTC m=+87.600316389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.446851 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.451698 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.471770 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.492220 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.495727 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-policies\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.511356 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.521276 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.531730 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.536061 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.539360 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.539542 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.039513356 +0000 UTC m=+87.701995136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.541784 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.543433 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.043415622 +0000 UTC m=+87.705897372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.552302 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.559100 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4bd443a2-dd2a-4278-854c-0d7c403b2603-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ksgtz\" (UID: \"4bd443a2-dd2a-4278-854c-0d7c403b2603\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.572714 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.591948 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.612244 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.618020 4747 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.618109 4747 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-webhook-cert podName:e72da77b-12ea-4b7a-bc1c-6d157c393bc0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.118089076 +0000 UTC m=+87.780570806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-webhook-cert") pod "packageserver-d55dfcdfc-lfdbh" (UID: "e72da77b-12ea-4b7a-bc1c-6d157c393bc0") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.618103 4747 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.620367 4747 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.620904 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533808f8-dc48-431c-bfe9-6019090f4832-metrics-tls podName:533808f8-dc48-431c-bfe9-6019090f4832 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.118254081 +0000 UTC m=+87.780735861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533808f8-dc48-431c-bfe9-6019090f4832-metrics-tls") pod "dns-default-cgmxb" (UID: "533808f8-dc48-431c-bfe9-6019090f4832") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.620990 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-node-bootstrap-token podName:d1eb69af-0de8-4778-af20-7a12273e384d nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:35.120967854 +0000 UTC m=+87.783449654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-node-bootstrap-token") pod "machine-config-server-fsbcb" (UID: "d1eb69af-0de8-4778-af20-7a12273e384d") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.621017 4747 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.621044 4747 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.621050 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-trusted-ca-bundle podName:e6b781bc-66b7-427e-8175-bd578310559e nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.121042906 +0000 UTC m=+87.783524636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-trusted-ca-bundle") pod "apiserver-7bbb656c7d-w8vhf" (UID: "e6b781bc-66b7-427e-8175-bd578310559e") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.621107 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/533808f8-dc48-431c-bfe9-6019090f4832-config-volume podName:533808f8-dc48-431c-bfe9-6019090f4832 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.121089518 +0000 UTC m=+87.783571288 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/533808f8-dc48-431c-bfe9-6019090f4832-config-volume") pod "dns-default-cgmxb" (UID: "533808f8-dc48-431c-bfe9-6019090f4832") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.621794 4747 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.621910 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-apiservice-cert podName:e72da77b-12ea-4b7a-bc1c-6d157c393bc0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.121880519 +0000 UTC m=+87.784362289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-apiservice-cert") pod "packageserver-d55dfcdfc-lfdbh" (UID: "e72da77b-12ea-4b7a-bc1c-6d157c393bc0") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.622066 4747 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.622105 4747 secret.go:188] Couldn't get secret openshift-apiserver-operator/openshift-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.622154 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe3d7e88-f7a4-428b-9415-1dd1b4047015-serving-cert podName:fe3d7e88-f7a4-428b-9415-1dd1b4047015 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:35.122140476 +0000 UTC m=+87.784622246 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fe3d7e88-f7a4-428b-9415-1dd1b4047015-serving-cert") pod "openshift-apiserver-operator-796bbdcf4f-n7b75" (UID: "fe3d7e88-f7a4-428b-9415-1dd1b4047015") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.622189 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-certs podName:d1eb69af-0de8-4778-af20-7a12273e384d nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.122176147 +0000 UTC m=+87.784657917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-certs") pod "machine-config-server-fsbcb" (UID: "d1eb69af-0de8-4778-af20-7a12273e384d") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.623870 4747 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.623980 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-srv-cert podName:df275db8-c500-4448-8073-97e037ad189f nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.123955595 +0000 UTC m=+87.786437435 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-srv-cert") pod "olm-operator-6b444d44fb-6292s" (UID: "df275db8-c500-4448-8073-97e037ad189f") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.624015 4747 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.624087 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-encryption-config podName:e6b781bc-66b7-427e-8175-bd578310559e nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.124070129 +0000 UTC m=+87.786551889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-encryption-config") pod "apiserver-7bbb656c7d-w8vhf" (UID: "e6b781bc-66b7-427e-8175-bd578310559e") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.624933 4747 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625118 4747 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625181 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fe3d7e88-f7a4-428b-9415-1dd1b4047015-config podName:fe3d7e88-f7a4-428b-9415-1dd1b4047015 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:35.125166278 +0000 UTC m=+87.787648038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/fe3d7e88-f7a4-428b-9415-1dd1b4047015-config") pod "openshift-apiserver-operator-796bbdcf4f-n7b75" (UID: "fe3d7e88-f7a4-428b-9415-1dd1b4047015") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625179 4747 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625254 4747 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625298 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-etcd-serving-ca podName:e6b781bc-66b7-427e-8175-bd578310559e nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.125285912 +0000 UTC m=+87.787767682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-etcd-serving-ca") pod "apiserver-7bbb656c7d-w8vhf" (UID: "e6b781bc-66b7-427e-8175-bd578310559e") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625324 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-serving-cert podName:e6b781bc-66b7-427e-8175-bd578310559e nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.125312292 +0000 UTC m=+87.787794062 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-serving-cert") pod "apiserver-7bbb656c7d-w8vhf" (UID: "e6b781bc-66b7-427e-8175-bd578310559e") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625322 4747 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625414 4747 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625510 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de3d9a69-800a-4146-9ed2-b74e44da14ed-cert podName:de3d9a69-800a-4146-9ed2-b74e44da14ed nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.125498187 +0000 UTC m=+87.787979917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de3d9a69-800a-4146-9ed2-b74e44da14ed-cert") pod "ingress-canary-nvrln" (UID: "de3d9a69-800a-4146-9ed2-b74e44da14ed") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625351 4747 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625535 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-audit-policies podName:e6b781bc-66b7-427e-8175-bd578310559e nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.125527528 +0000 UTC m=+87.788009258 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-audit-policies") pod "apiserver-7bbb656c7d-w8vhf" (UID: "e6b781bc-66b7-427e-8175-bd578310559e") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625355 4747 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625573 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6fa18d98-d246-4a68-99bb-98a50a2b1e87-config podName:6fa18d98-d246-4a68-99bb-98a50a2b1e87 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.125565519 +0000 UTC m=+87.788047249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6fa18d98-d246-4a68-99bb-98a50a2b1e87-config") pod "service-ca-operator-777779d784-stzdc" (UID: "6fa18d98-d246-4a68-99bb-98a50a2b1e87") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625589 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fa18d98-d246-4a68-99bb-98a50a2b1e87-serving-cert podName:6fa18d98-d246-4a68-99bb-98a50a2b1e87 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.12558298 +0000 UTC m=+87.788064710 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6fa18d98-d246-4a68-99bb-98a50a2b1e87-serving-cert") pod "service-ca-operator-777779d784-stzdc" (UID: "6fa18d98-d246-4a68-99bb-98a50a2b1e87") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.625602 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-etcd-client podName:e6b781bc-66b7-427e-8175-bd578310559e nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.12559618 +0000 UTC m=+87.788077910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-etcd-client") pod "apiserver-7bbb656c7d-w8vhf" (UID: "e6b781bc-66b7-427e-8175-bd578310559e") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.633978 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.643842 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.644308 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.144272766 +0000 UTC m=+87.806754566 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.644825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.646279 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.14625216 +0000 UTC m=+87.808733920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.651610 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.672545 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.693082 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.711456 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.731902 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.747963 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.748291 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.248249845 +0000 UTC m=+87.910731585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.748665 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.749095 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.249075278 +0000 UTC m=+87.911557018 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.751091 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.773663 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.792433 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.811720 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.833545 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.850968 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.851235 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.351180266 +0000 UTC m=+88.013662036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.851977 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.852639 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.352615065 +0000 UTC m=+88.015096865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.853356 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.872413 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.892276 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.913397 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.933465 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.952592 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.954092 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4747]: E1128 13:20:34.954866 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.454827666 +0000 UTC m=+88.117309446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.972500 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 28 13:20:34 crc kubenswrapper[4747]: I1128 13:20:34.991928 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.012313 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.032771 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.052078 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.056628 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.057416 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.557383317 +0000 UTC m=+88.219865077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.072506 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.092566 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.111687 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.132428 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.152634 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.157635 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.157892 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.657854971 +0000 UTC m=+88.320336751 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.158162 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-webhook-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.158305 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533808f8-dc48-431c-bfe9-6019090f4832-metrics-tls\") pod \"dns-default-cgmxb\" (UID: 
\"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.158449 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/533808f8-dc48-431c-bfe9-6019090f4832-config-volume\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.158681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe3d7e88-f7a4-428b-9415-1dd1b4047015-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.158827 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-apiservice-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.158952 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159085 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159191 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-encryption-config\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159362 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-srv-cert\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.159443 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.659421503 +0000 UTC m=+88.321903243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159526 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-certs\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159615 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa18d98-d246-4a68-99bb-98a50a2b1e87-serving-cert\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159650 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-etcd-client\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159727 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa18d98-d246-4a68-99bb-98a50a2b1e87-config\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159777 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-audit-policies\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159798 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de3d9a69-800a-4146-9ed2-b74e44da14ed-cert\") pod \"ingress-canary-nvrln\" (UID: \"de3d9a69-800a-4146-9ed2-b74e44da14ed\") " pod="openshift-ingress-canary/ingress-canary-nvrln" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159822 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-serving-cert\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159867 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3d7e88-f7a4-428b-9415-1dd1b4047015-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.159889 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-node-bootstrap-token\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.160679 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa18d98-d246-4a68-99bb-98a50a2b1e87-config\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.161057 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-audit-policies\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.161137 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/533808f8-dc48-431c-bfe9-6019090f4832-config-volume\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.161119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.162561 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e6b781bc-66b7-427e-8175-bd578310559e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.162909 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3d7e88-f7a4-428b-9415-1dd1b4047015-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.165960 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-serving-cert\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.166646 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df275db8-c500-4448-8073-97e037ad189f-srv-cert\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.167155 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe3d7e88-f7a4-428b-9415-1dd1b4047015-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.167409 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-apiservice-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.168190 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa18d98-d246-4a68-99bb-98a50a2b1e87-serving-cert\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.168283 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533808f8-dc48-431c-bfe9-6019090f4832-metrics-tls\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.168750 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-webhook-cert\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.170265 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-encryption-config\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.171841 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e6b781bc-66b7-427e-8175-bd578310559e-etcd-client\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.171997 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.185329 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-certs\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.191511 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.213845 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.224616 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1eb69af-0de8-4778-af20-7a12273e384d-node-bootstrap-token\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.250486 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnpx\" 
(UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-kube-api-access-qwnpx\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.261084 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.261258 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.761234024 +0000 UTC m=+88.423715754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.262155 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.262646 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.762627432 +0000 UTC m=+88.425109282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.272657 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.274560 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-bound-sa-token\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.293600 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.313644 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.326373 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de3d9a69-800a-4146-9ed2-b74e44da14ed-cert\") pod \"ingress-canary-nvrln\" (UID: \"de3d9a69-800a-4146-9ed2-b74e44da14ed\") " pod="openshift-ingress-canary/ingress-canary-nvrln"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.332038 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.352703 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.363665 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.363995 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.863954539 +0000 UTC m=+88.526436279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.364380 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.364881 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.864869224 +0000 UTC m=+88.527351164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.369811 4747 request.go:700] Waited for 1.931516457s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.373043 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.392291 4747 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.432033 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd319a7-061b-48a6-8061-fa2f200e749f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ksn8f\" (UID: \"edd319a7-061b-48a6-8061-fa2f200e749f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.455062 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbwc\" (UniqueName: \"kubernetes.io/projected/22f19033-2027-4208-ae92-8955ad8d744f-kube-api-access-trbwc\") pod \"machine-config-operator-74547568cd-bmpd6\" (UID: \"22f19033-2027-4208-ae92-8955ad8d744f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.466287 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.467139 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.967102955 +0000 UTC m=+88.629584735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.472160 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rjr\" (UniqueName: \"kubernetes.io/projected/b16cb6fb-6e9e-440b-96df-50841a2e14d3-kube-api-access-x6rjr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7rvs2\" (UID: \"b16cb6fb-6e9e-440b-96df-50841a2e14d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.474691 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.483909 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.489824 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6jjwc\" (UID: \"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.510337 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6xqb\" (UniqueName: \"kubernetes.io/projected/8a7ae477-ca56-4362-a334-a2915d71fdf0-kube-api-access-f6xqb\") pod \"machine-api-operator-5694c8668f-4hlq5\" (UID: \"8a7ae477-ca56-4362-a334-a2915d71fdf0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.530153 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4d7v\" (UniqueName: \"kubernetes.io/projected/4a180bf3-5719-4d9b-8ebb-beb0315e7cac-kube-api-access-c4d7v\") pod \"console-operator-58897d9998-nq9nc\" (UID: \"4a180bf3-5719-4d9b-8ebb-beb0315e7cac\") " pod="openshift-console-operator/console-operator-58897d9998-nq9nc"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.541773 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.553878 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfj6\" (UniqueName: \"kubernetes.io/projected/479245d2-229a-4cd7-b1c8-b1125d3c9ba9-kube-api-access-4pfj6\") pod \"service-ca-9c57cc56f-49gpd\" (UID: \"479245d2-229a-4cd7-b1c8-b1125d3c9ba9\") " pod="openshift-service-ca/service-ca-9c57cc56f-49gpd"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.570747 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.571152 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br575\" (UniqueName: \"kubernetes.io/projected/7204cb20-aa3b-4484-bc0e-64155cb7f734-kube-api-access-br575\") pod \"package-server-manager-789f6589d5-6bf44\" (UID: \"7204cb20-aa3b-4484-bc0e-64155cb7f734\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.571189 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.071176387 +0000 UTC m=+88.733658117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.595856 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66p7w\" (UniqueName: \"kubernetes.io/projected/1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed-kube-api-access-66p7w\") pod \"control-plane-machine-set-operator-78cbb6b69f-dw4lg\" (UID: \"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.608875 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsc67\" (UniqueName: \"kubernetes.io/projected/b921d77a-e870-46f5-afe8-e071962b3881-kube-api-access-fsc67\") pod \"apiserver-76f77b778f-jdt8x\" (UID: \"b921d77a-e870-46f5-afe8-e071962b3881\") " pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.628783 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9kqt\" (UniqueName: \"kubernetes.io/projected/40766993-1700-48b0-97ab-775d3076167f-kube-api-access-p9kqt\") pod \"etcd-operator-b45778765-sf6rs\" (UID: \"40766993-1700-48b0-97ab-775d3076167f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.650442 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssrh\" (UniqueName: \"kubernetes.io/projected/f0a48ada-f9d6-49a1-a109-11b05e4b757c-kube-api-access-bssrh\") pod \"console-f9d7485db-cjg4p\" (UID: \"f0a48ada-f9d6-49a1-a109-11b05e4b757c\") " pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.654772 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.668742 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69dm\" (UniqueName: \"kubernetes.io/projected/72e9d3f8-217e-4a29-a43b-e2b17998b6e1-kube-api-access-z69dm\") pod \"authentication-operator-69f744f599-7pl7n\" (UID: \"72e9d3f8-217e-4a29-a43b-e2b17998b6e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.672989 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.673161 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.173135412 +0000 UTC m=+88.835617142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.673439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.673758 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.173747478 +0000 UTC m=+88.836229228 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.675960 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.697807 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlxzs\" (UniqueName: \"kubernetes.io/projected/e3917efc-ed36-4d2a-9af3-690506b9e302-kube-api-access-qlxzs\") pod \"machine-config-controller-84d6567774-rqm6f\" (UID: \"e3917efc-ed36-4d2a-9af3-690506b9e302\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.707393 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2z4\" (UniqueName: \"kubernetes.io/projected/a9581480-5267-4031-b97c-cf5e5546448e-kube-api-access-lm2z4\") pod \"dns-operator-744455d44c-66kvx\" (UID: \"a9581480-5267-4031-b97c-cf5e5546448e\") " pod="openshift-dns-operator/dns-operator-744455d44c-66kvx"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.728127 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnqkz\" (UniqueName: \"kubernetes.io/projected/d89e4b8f-9b72-467e-bc54-c4e6421717ac-kube-api-access-bnqkz\") pod \"route-controller-manager-6576b87f9c-67648\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.741218 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nq9nc"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.745289 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.747494 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9prxl\" (UniqueName: \"kubernetes.io/projected/750cc1d2-faa4-46ca-86a1-57720f4d922a-kube-api-access-9prxl\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.748811 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6"]
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.753617 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-49gpd"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.758381 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.765041 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f"]
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.766706 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2fl\" (UniqueName: \"kubernetes.io/projected/91bb382c-c2d4-4118-aee5-1df968d79d25-kube-api-access-5j2fl\") pod \"kube-storage-version-migrator-operator-b67b599dd-dsc95\" (UID: \"91bb382c-c2d4-4118-aee5-1df968d79d25\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.766908 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.774902 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.775244 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.275220939 +0000 UTC m=+88.937702679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.790878 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2zkl\" (UniqueName: \"kubernetes.io/projected/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-kube-api-access-g2zkl\") pod \"controller-manager-879f6c89f-926zp\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-926zp"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.791821 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-66kvx"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.806802 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.814373 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc"]
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.817439 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz6qg\" (UniqueName: \"kubernetes.io/projected/d3f4ad9e-7db1-42fa-94ed-232c8b908911-kube-api-access-pz6qg\") pod \"machine-approver-56656f9798-qk8hm\" (UID: \"d3f4ad9e-7db1-42fa-94ed-232c8b908911\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.830240 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.831640 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsfsj\" (UniqueName: \"kubernetes.io/projected/6fa18d98-d246-4a68-99bb-98a50a2b1e87-kube-api-access-gsfsj\") pod \"service-ca-operator-777779d784-stzdc\" (UID: \"6fa18d98-d246-4a68-99bb-98a50a2b1e87\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc"
Nov 28 13:20:35 crc kubenswrapper[4747]: W1128 13:20:35.833444 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84cb6d2e_856a_4d19_b6c3_8c6fdf2caad4.slice/crio-cbbd7c3e88f30942f7c50e294b2160c76ec3a15a508af88928f8b84a4e9a7367 WatchSource:0}: Error finding container cbbd7c3e88f30942f7c50e294b2160c76ec3a15a508af88928f8b84a4e9a7367: Status 404 returned error can't find the container with id cbbd7c3e88f30942f7c50e294b2160c76ec3a15a508af88928f8b84a4e9a7367
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.837298 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.845737 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plq7\" (UniqueName: \"kubernetes.io/projected/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-kube-api-access-8plq7\") pod \"collect-profiles-29405595-wgstb\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.868527 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rk7j\" (UniqueName: \"kubernetes.io/projected/de3d9a69-800a-4146-9ed2-b74e44da14ed-kube-api-access-5rk7j\") pod \"ingress-canary-nvrln\" (UID: \"de3d9a69-800a-4146-9ed2-b74e44da14ed\") " pod="openshift-ingress-canary/ingress-canary-nvrln"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.877717 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.878083 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.378066018 +0000 UTC m=+89.040547748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.888510 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.896604 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.897805 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl48w\" (UniqueName: \"kubernetes.io/projected/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-kube-api-access-wl48w\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.907266 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.913685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmbz\" (UniqueName: \"kubernetes.io/projected/1ff99da0-58a5-4e05-8e55-88f24bb0a962-kube-api-access-bbmbz\") pod \"router-default-5444994796-z6ppk\" (UID: \"1ff99da0-58a5-4e05-8e55-88f24bb0a962\") " pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.934142 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkb9d\" (UniqueName: \"kubernetes.io/projected/df275db8-c500-4448-8073-97e037ad189f-kube-api-access-kkb9d\") pod \"olm-operator-6b444d44fb-6292s\" (UID: \"df275db8-c500-4448-8073-97e037ad189f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.948614 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.955176 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.960714 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bf48580-f8cb-46a8-8d47-f1317a13eca4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5cpjw\" (UID: \"5bf48580-f8cb-46a8-8d47-f1317a13eca4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.961199 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.978831 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:35 crc kubenswrapper[4747]: E1128 13:20:35.979918 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.479901209 +0000 UTC m=+89.142382939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.980688 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d776caa-4299-4b4c-b60c-2bbbe67b02a0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ngfzt\" (UID: \"5d776caa-4299-4b4c-b60c-2bbbe67b02a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"
Nov 28 13:20:35 crc kubenswrapper[4747]: I1128 13:20:35.997476 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jdt8x"]
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.016429 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.017173 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54kk\" (UniqueName: \"kubernetes.io/projected/e6b781bc-66b7-427e-8175-bd578310559e-kube-api-access-j54kk\") pod \"apiserver-7bbb656c7d-w8vhf\" (UID: \"e6b781bc-66b7-427e-8175-bd578310559e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.024727 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.036869 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xnjf\" (UniqueName: \"kubernetes.io/projected/1a702977-9ffe-4c8a-b5d8-c49bff5b7030-kube-api-access-8xnjf\") pod \"migrator-59844c95c7-gxp24\" (UID: \"1a702977-9ffe-4c8a-b5d8-c49bff5b7030\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.043741 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg"]
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.043894 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhrlb\" (UniqueName: \"kubernetes.io/projected/871d0c50-a5cc-4de2-9566-0dc749cb24f2-kube-api-access-lhrlb\") pod \"csi-hostpathplugin-wqrlt\" (UID: \"871d0c50-a5cc-4de2-9566-0dc749cb24f2\") " pod="hostpath-provisioner/csi-hostpathplugin-wqrlt"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.051300 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qk7\" (UniqueName: \"kubernetes.io/projected/5a895309-39e8-42bb-8df7-821c0c600504-kube-api-access-79qk7\") pod \"downloads-7954f5f757-c6vdt\" (UID: \"5a895309-39e8-42bb-8df7-821c0c600504\") " pod="openshift-console/downloads-7954f5f757-c6vdt"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.052328 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nvrln"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.071817 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td54k\" (UniqueName: \"kubernetes.io/projected/e72da77b-12ea-4b7a-bc1c-6d157c393bc0-kube-api-access-td54k\") pod \"packageserver-d55dfcdfc-lfdbh\" (UID: \"e72da77b-12ea-4b7a-bc1c-6d157c393bc0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.074628 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.083070 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.083558 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.583545019 +0000 UTC m=+89.246026749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.094889 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj4hc\" (UniqueName: \"kubernetes.io/projected/d1eb69af-0de8-4778-af20-7a12273e384d-kube-api-access-gj4hc\") pod \"machine-config-server-fsbcb\" (UID: \"d1eb69af-0de8-4778-af20-7a12273e384d\") " pod="openshift-machine-config-operator/machine-config-server-fsbcb"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.109641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/750cc1d2-faa4-46ca-86a1-57720f4d922a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4fsxt\" (UID: \"750cc1d2-faa4-46ca-86a1-57720f4d922a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.127361 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4hlq5"]
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.128969 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gdlt\" (UniqueName: \"kubernetes.io/projected/fe3d7e88-f7a4-428b-9415-1dd1b4047015-kube-api-access-6gdlt\") pod \"openshift-apiserver-operator-796bbdcf4f-n7b75\" (UID: \"fe3d7e88-f7a4-428b-9415-1dd1b4047015\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.152703 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.153180 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-c6vdt"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.158982 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.159028 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhp2f\" (UniqueName: \"kubernetes.io/projected/6f46540d-f949-4ebd-aa09-0336f09ddfef-kube-api-access-nhp2f\") pod \"marketplace-operator-79b997595-dr67x\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " pod="openshift-marketplace/marketplace-operator-79b997595-dr67x"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.171317 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9kpr\" (UniqueName: \"kubernetes.io/projected/4bd443a2-dd2a-4278-854c-0d7c403b2603-kube-api-access-z9kpr\") pod \"cluster-samples-operator-665b6dd947-ksgtz\" (UID: \"4bd443a2-dd2a-4278-854c-0d7c403b2603\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.175878 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.189504 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.189883 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.689837221 +0000 UTC m=+89.352318961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.195377 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqjvm\" (UniqueName: \"kubernetes.io/projected/46d3747c-fa48-416e-84b8-c0d7ad4394f2-kube-api-access-jqjvm\") pod \"catalog-operator-68c6474976-nxfwm\" (UID: \"46d3747c-fa48-416e-84b8-c0d7ad4394f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm"
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.214656 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.215604 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmf9l\" (UniqueName: \"kubernetes.io/projected/533808f8-dc48-431c-bfe9-6019090f4832-kube-api-access-cmf9l\") pod \"dns-default-cgmxb\" (UID: \"533808f8-dc48-431c-bfe9-6019090f4832\") " pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.235429 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.244160 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.249192 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8np\" (UniqueName: \"kubernetes.io/projected/22026c31-b3d5-4041-8835-4a0f97f456e6-kube-api-access-db8np\") pod \"multus-admission-controller-857f4d67dd-lzcth\" (UID: \"22026c31-b3d5-4041-8835-4a0f97f456e6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.254018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" event={"ID":"22f19033-2027-4208-ae92-8955ad8d744f","Type":"ContainerStarted","Data":"48c44a72ea7cbd968bc7aa87bc10da25fbc6b3ad5625d1d2ce0579b782e8d6d9"} Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.256986 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" event={"ID":"b921d77a-e870-46f5-afe8-e071962b3881","Type":"ContainerStarted","Data":"d754826bd01c0377164d6569858200870a9984310a48e8b6588bcdae3b8627e4"} 
Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.259417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" event={"ID":"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4","Type":"ContainerStarted","Data":"cbbd7c3e88f30942f7c50e294b2160c76ec3a15a508af88928f8b84a4e9a7367"} Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.261527 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" event={"ID":"edd319a7-061b-48a6-8061-fa2f200e749f","Type":"ContainerStarted","Data":"56152bcea991d475e758e09c42125b4dd1a944fb611ddbe24b81f30000896a81"} Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.263574 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" event={"ID":"d3f4ad9e-7db1-42fa-94ed-232c8b908911","Type":"ContainerStarted","Data":"72efc500f743402707c24a4e531036ec2a59e487c7753f4a54d6284d50a3875b"} Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.264098 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdcdf\" (UniqueName: \"kubernetes.io/projected/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-kube-api-access-vdcdf\") pod \"oauth-openshift-558db77b4-59qwc\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.268837 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:20:36 crc kubenswrapper[4747]: W1128 13:20:36.269310 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fd74f5c_e178_4dd8_b8a9_9f58ca6e26ed.slice/crio-f30ecec2d391d4267100fe560f96820cde23824cee07d7a01be495484e72d306 WatchSource:0}: Error finding container f30ecec2d391d4267100fe560f96820cde23824cee07d7a01be495484e72d306: Status 404 returned error can't find the container with id f30ecec2d391d4267100fe560f96820cde23824cee07d7a01be495484e72d306 Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.273662 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.275141 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.276526 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8rm6\" (UniqueName: \"kubernetes.io/projected/3e9185d1-abcc-43fc-a1af-42834e838dae-kube-api-access-q8rm6\") pod \"openshift-config-operator-7777fb866f-7dbpj\" (UID: \"3e9185d1-abcc-43fc-a1af-42834e838dae\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.284027 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.291373 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.291753 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.791738614 +0000 UTC m=+89.454220344 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.292404 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.292651 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.299593 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.306695 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.313164 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.332741 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.334467 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cgmxb" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.344777 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fsbcb" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.352888 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.370320 4747 request.go:700] Waited for 2.728773854s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&limit=500&resourceVersion=0 Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.372035 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.393468 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.393794 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.89376297 +0000 UTC m=+89.556244700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.394025 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.394422 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:36.894405537 +0000 UTC m=+89.556887267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.469711 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.506894 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sf6rs"] Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.507176 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.507688 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.007672818 +0000 UTC m=+89.670154548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.507813 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.509041 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"] Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.523114 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-49gpd"] Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.608732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.609358 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.109344885 +0000 UTC m=+89.771826615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.617829 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-66kvx"] Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.629485 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cjg4p"] Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.710304 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.710424 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.210389674 +0000 UTC m=+89.872871404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.710670 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.710960 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.21095188 +0000 UTC m=+89.873433610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.816497 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.817150 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.317134678 +0000 UTC m=+89.979616408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.924123 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:36 crc kubenswrapper[4747]: E1128 13:20:36.924520 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.42450505 +0000 UTC m=+90.086986780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.991726 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2"] Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.992982 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"] Nov 28 13:20:36 crc kubenswrapper[4747]: I1128 13:20:36.998225 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nq9nc"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.026904 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.027222 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.527153533 +0000 UTC m=+90.189635273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.027452 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.028232 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.528198351 +0000 UTC m=+90.190680081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.037796 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.201904 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.202673 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.70263321 +0000 UTC m=+90.365114940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.213230 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.213765 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.713741742 +0000 UTC m=+90.376223472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: W1128 13:20:37.216443 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3917efc_ed36_4d2a_9af3_690506b9e302.slice/crio-676f2ce33abf591674412da04e839da2a83d7781efec245e0ee1e3462c63a9e6 WatchSource:0}: Error finding container 676f2ce33abf591674412da04e839da2a83d7781efec245e0ee1e3462c63a9e6: Status 404 returned error can't find the container with id 676f2ce33abf591674412da04e839da2a83d7781efec245e0ee1e3462c63a9e6 Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.291762 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7pl7n"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.314882 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.315223 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.815193442 +0000 UTC m=+90.477675162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.326428 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nq9nc" event={"ID":"4a180bf3-5719-4d9b-8ebb-beb0315e7cac","Type":"ContainerStarted","Data":"f784c615528bf4ea02672fb23751d25f6ffddc57d22b0e77ded2e116ce278f7a"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.353908 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" event={"ID":"a9581480-5267-4031-b97c-cf5e5546448e","Type":"ContainerStarted","Data":"2f8ac88b1392ba2bae06cf693dca3a3cf430ddf0075d0dbc9e7166904c5bf08c"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.410786 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-926zp"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.423328 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nvrln"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.425602 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.426480 4747 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:37.926466559 +0000 UTC m=+90.588948289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.433237 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" event={"ID":"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76","Type":"ContainerStarted","Data":"c73e005fc8ea79899da802b4ea823c280664385e50f49a334bc1b34634d17523"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.445076 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" event={"ID":"edd319a7-061b-48a6-8061-fa2f200e749f","Type":"ContainerStarted","Data":"74fed807645d2c0d2b69fddaab307e71bb038eb6cb60ccf0e7e231e9caff060f"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.487138 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.503056 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.532994 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.533040 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg" event={"ID":"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed","Type":"ContainerStarted","Data":"49aa19add0b7770a328813bad3255159da77169884f3f9293f35b138d2082d80"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.533059 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg" event={"ID":"1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed","Type":"ContainerStarted","Data":"f30ecec2d391d4267100fe560f96820cde23824cee07d7a01be495484e72d306"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.534659 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.539432 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95"] Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.539915 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.039897274 +0000 UTC m=+90.702379004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.548904 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-stzdc"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.552741 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" event={"ID":"22f19033-2027-4208-ae92-8955ad8d744f","Type":"ContainerStarted","Data":"e700ff8e848db209e35a0d96929cc2840f9746ac5367083080abdeb56730f07f"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.552778 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" event={"ID":"22f19033-2027-4208-ae92-8955ad8d744f","Type":"ContainerStarted","Data":"d25a0cd0aa18c2b2d9cc141e5a39b5e92ace0ec5cf439dcafbed31cd9c8afd71"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.570522 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wqrlt"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.578820 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" event={"ID":"84cb6d2e-856a-4d19-b6c3-8c6fdf2caad4","Type":"ContainerStarted","Data":"f6f7e04f158664a7453c0a21c44c61c3ab74accbb7c8d1020f8a38eb563bff1e"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.613249 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" event={"ID":"479245d2-229a-4cd7-b1c8-b1125d3c9ba9","Type":"ContainerStarted","Data":"d086299a9c9f3d4e4da5c2b14eba9bbd446c6ae378e6ee3db21fa76ee6fd6779"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.622259 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z6ppk" event={"ID":"1ff99da0-58a5-4e05-8e55-88f24bb0a962","Type":"ContainerStarted","Data":"d6a26326727911041af9793d6317607d77c5043a0e0558e846aab592faa15cdf"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.633571 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" event={"ID":"8a7ae477-ca56-4362-a334-a2915d71fdf0","Type":"ContainerStarted","Data":"41429f080aa664dae5794382e997252f5752698952f09d9921d90ac9c7955de0"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.633607 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" event={"ID":"8a7ae477-ca56-4362-a334-a2915d71fdf0","Type":"ContainerStarted","Data":"c4d680701b75fb22feb09fc6fc14b13ffc2955471ff839d6748c0bbc907da1a6"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.647965 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.649256 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:38.149239008 +0000 UTC m=+90.811720738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.666487 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" event={"ID":"7204cb20-aa3b-4484-bc0e-64155cb7f734","Type":"ContainerStarted","Data":"b6c604259a73b2f04d65dbc6a81c4f0b51acb1642b900aef93d41e10e49f4850"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.678900 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cjg4p" event={"ID":"f0a48ada-f9d6-49a1-a109-11b05e4b757c","Type":"ContainerStarted","Data":"2d9f31f5b9d5fd4e0b948254e7dd2dbca90aaf1cf375deecb545e8e99e82e674"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.722785 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fsbcb" event={"ID":"d1eb69af-0de8-4778-af20-7a12273e384d","Type":"ContainerStarted","Data":"458761f005b06191b8a102b0beab99f7046bee5a5ef7c0b808076856adf77d05"} Nov 28 13:20:37 crc kubenswrapper[4747]: W1128 13:20:37.724262 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf48580_f8cb_46a8_8d47_f1317a13eca4.slice/crio-467e5a1e99af94e0dec67e7dbc3b02dfc4b059f4e9d3d67f04be5b97de4b61a3 WatchSource:0}: Error finding container 467e5a1e99af94e0dec67e7dbc3b02dfc4b059f4e9d3d67f04be5b97de4b61a3: Status 
404 returned error can't find the container with id 467e5a1e99af94e0dec67e7dbc3b02dfc4b059f4e9d3d67f04be5b97de4b61a3 Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.760372 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.766149 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.266124447 +0000 UTC m=+90.928606167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.780491 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" event={"ID":"b16cb6fb-6e9e-440b-96df-50841a2e14d3","Type":"ContainerStarted","Data":"71e001a23ac9b3a73aba14bf3729a009d27a5d176c32599002c08eb37a3e1334"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.789816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" 
event={"ID":"40766993-1700-48b0-97ab-775d3076167f","Type":"ContainerStarted","Data":"cea5d7312c5e63a226ac41da9530529a9fa0100574f02f3732759cc9ec42b6ea"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.817665 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" event={"ID":"d3f4ad9e-7db1-42fa-94ed-232c8b908911","Type":"ContainerStarted","Data":"2c6dacc1d5c024761244a1cff9deaa661c7bc8b2f10d2ed5eb0b2ceb2bb4e022"} Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.819435 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jm2h8" podStartSLOduration=71.819407842 podStartE2EDuration="1m11.819407842s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:37.80346988 +0000 UTC m=+90.465951600" watchObservedRunningTime="2025-11-28 13:20:37.819407842 +0000 UTC m=+90.481889572" Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.819655 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dr67x"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.866565 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.867773 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:38.367755333 +0000 UTC m=+91.030237063 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.920952 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.936286 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.953268 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-c6vdt"] Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.971504 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.972852 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.472829512 +0000 UTC m=+91.135311242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.977381 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:37 crc kubenswrapper[4747]: E1128 13:20:37.977914 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.477902219 +0000 UTC m=+91.140383949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:37 crc kubenswrapper[4747]: I1128 13:20:37.982689 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24"] Nov 28 13:20:37 crc kubenswrapper[4747]: W1128 13:20:37.984783 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d776caa_4299_4b4c_b60c_2bbbe67b02a0.slice/crio-c1ccab4944941fd56f88c88af011fbbc256c916f20e7c8754d29af3a5df10e70 WatchSource:0}: Error finding container c1ccab4944941fd56f88c88af011fbbc256c916f20e7c8754d29af3a5df10e70: Status 404 returned error can't find the container with id c1ccab4944941fd56f88c88af011fbbc256c916f20e7c8754d29af3a5df10e70 Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.003534 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.011883 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-59qwc"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.017801 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ksn8f" podStartSLOduration=72.01778342 podStartE2EDuration="1m12.01778342s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.014707427 +0000 UTC m=+90.677189157" watchObservedRunningTime="2025-11-28 13:20:38.01778342 +0000 UTC m=+90.680265150" Nov 28 13:20:38 crc kubenswrapper[4747]: W1128 13:20:38.060549 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d3747c_fa48_416e_84b8_c0d7ad4394f2.slice/crio-92050d26829c5ab056cc252decdf29e57a5889d8c4f0b3f3a98920f152a125a9 WatchSource:0}: Error finding container 92050d26829c5ab056cc252decdf29e57a5889d8c4f0b3f3a98920f152a125a9: Status 404 returned error can't find the container with id 92050d26829c5ab056cc252decdf29e57a5889d8c4f0b3f3a98920f152a125a9 Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.078231 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.078519 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.578504867 +0000 UTC m=+91.240986597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.095129 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" podStartSLOduration=72.095112067 podStartE2EDuration="1m12.095112067s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.093672128 +0000 UTC m=+90.756153858" watchObservedRunningTime="2025-11-28 13:20:38.095112067 +0000 UTC m=+90.757593797" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.095493 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6jjwc" podStartSLOduration=72.095488897 podStartE2EDuration="1m12.095488897s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.044525575 +0000 UTC m=+90.707007305" watchObservedRunningTime="2025-11-28 13:20:38.095488897 +0000 UTC m=+90.757970627" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.097822 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cgmxb"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.153602 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-server-fsbcb" podStartSLOduration=5.153580842 podStartE2EDuration="5.153580842s" podCreationTimestamp="2025-11-28 13:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.144593009 +0000 UTC m=+90.807074759" watchObservedRunningTime="2025-11-28 13:20:38.153580842 +0000 UTC m=+90.816062572" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.155307 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-z6ppk" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.167492 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:38 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Nov 28 13:20:38 crc kubenswrapper[4747]: [+]process-running ok Nov 28 13:20:38 crc kubenswrapper[4747]: healthz check failed Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.167537 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.181700 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.182003 4747 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.681988692 +0000 UTC m=+91.344470412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.183560 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-z6ppk" podStartSLOduration=72.183543645 podStartE2EDuration="1m12.183543645s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.183128343 +0000 UTC m=+90.845610073" watchObservedRunningTime="2025-11-28 13:20:38.183543645 +0000 UTC m=+90.846025375" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.218308 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.286087 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.286823 4747 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.786790624 +0000 UTC m=+91.449272354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.287090 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.287499 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.787487363 +0000 UTC m=+91.449969083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.312549 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-cjg4p" podStartSLOduration=72.312531832 podStartE2EDuration="1m12.312531832s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.216069886 +0000 UTC m=+90.878551616" watchObservedRunningTime="2025-11-28 13:20:38.312531832 +0000 UTC m=+90.975013562" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.316111 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" podStartSLOduration=72.316099629 podStartE2EDuration="1m12.316099629s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.251700043 +0000 UTC m=+90.914181773" watchObservedRunningTime="2025-11-28 13:20:38.316099629 +0000 UTC m=+90.978581359" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.317076 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.343461 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-lzcth"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.344895 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.363102 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.376496 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dw4lg" podStartSLOduration=72.376471435 podStartE2EDuration="1m12.376471435s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.287999407 +0000 UTC m=+90.950481137" watchObservedRunningTime="2025-11-28 13:20:38.376471435 +0000 UTC m=+91.038953165" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.388745 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.389049 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:38.889034256 +0000 UTC m=+91.551515976 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.391577 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj"] Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.401817 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmpd6" podStartSLOduration=72.401797832 podStartE2EDuration="1m12.401797832s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.335134815 +0000 UTC m=+90.997616545" watchObservedRunningTime="2025-11-28 13:20:38.401797832 +0000 UTC m=+91.064279552" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.490174 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.490646 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:38.990628471 +0000 UTC m=+91.653110191 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: W1128 13:20:38.517332 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode72da77b_12ea_4b7a_bc1c_6d157c393bc0.slice/crio-e80e34d3309e0925d4f91003ed1925fba06c8c3915957639e03774c995400a5e WatchSource:0}: Error finding container e80e34d3309e0925d4f91003ed1925fba06c8c3915957639e03774c995400a5e: Status 404 returned error can't find the container with id e80e34d3309e0925d4f91003ed1925fba06c8c3915957639e03774c995400a5e Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.591274 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.591981 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.091964828 +0000 UTC m=+91.754446558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.692747 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.693119 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.19310886 +0000 UTC m=+91.855590590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.793788 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.794643 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.294614172 +0000 UTC m=+91.957095912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.837903 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cjg4p" event={"ID":"f0a48ada-f9d6-49a1-a109-11b05e4b757c","Type":"ContainerStarted","Data":"6d680c3a760fc41bb1fd12fa47df903ce06fea25eaecdba5a3067a9df20ab907"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.839363 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" event={"ID":"91bb382c-c2d4-4118-aee5-1df968d79d25","Type":"ContainerStarted","Data":"8a81e3e3208a1c54d251dd33a266776b21fd976937f5e61afce72ed8be575b45"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.840191 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-49gpd" event={"ID":"479245d2-229a-4cd7-b1c8-b1125d3c9ba9","Type":"ContainerStarted","Data":"cb619a1d76c014141d8166273e93b7ec426dff5a3c3b4c38f585589877a49608"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.841734 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" event={"ID":"e72da77b-12ea-4b7a-bc1c-6d157c393bc0","Type":"ContainerStarted","Data":"e80e34d3309e0925d4f91003ed1925fba06c8c3915957639e03774c995400a5e"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.843442 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" event={"ID":"6fa18d98-d246-4a68-99bb-98a50a2b1e87","Type":"ContainerStarted","Data":"c734114e308dd1cb4db3660a9bd9079c185057ea6b32b80e1000714683cceffe"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.843471 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" event={"ID":"6fa18d98-d246-4a68-99bb-98a50a2b1e87","Type":"ContainerStarted","Data":"53d5c9415f66d3e669afc6c3720581067ba2c03e51ef84478d45117137b352eb"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.846930 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" event={"ID":"d3f4ad9e-7db1-42fa-94ed-232c8b908911","Type":"ContainerStarted","Data":"be80122aef95aac6292d70612eb225ee3ed4c28c7d2c512be619f439f14cfa75"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.864028 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" event={"ID":"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76","Type":"ContainerStarted","Data":"9c824ea5fbc84eaf1b3a5b1c2d9780d4645a486006b9788fc1e98ebbece5a8c0"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.872017 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-c6vdt" event={"ID":"5a895309-39e8-42bb-8df7-821c0c600504","Type":"ContainerStarted","Data":"d34154c6d1237d1d682daaa82e0d98f655fef8de1a10a842d3cb85a965b68960"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.874777 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" event={"ID":"6f46540d-f949-4ebd-aa09-0336f09ddfef","Type":"ContainerStarted","Data":"656fd1f5e0c28a80a1331711c8e141c4bc0c7bf14e7a07d5eb92c85e7eb8b88c"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.874804 4747 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" event={"ID":"6f46540d-f949-4ebd-aa09-0336f09ddfef","Type":"ContainerStarted","Data":"002a33093848c0e8e648e36c64ab1afa2ad41dc1420b5cbd3b5ef0d65d2daa6f"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.875803 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.881227 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dr67x container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.881280 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.894306 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nq9nc" event={"ID":"4a180bf3-5719-4d9b-8ebb-beb0315e7cac","Type":"ContainerStarted","Data":"fd5ab85cb1412a0ecd8e004c396797f7a8a7c11c3fc535a1608c54b3a808a5c1"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.894803 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nq9nc" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.895724 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.896158 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" event={"ID":"46d3747c-fa48-416e-84b8-c0d7ad4394f2","Type":"ContainerStarted","Data":"92050d26829c5ab056cc252decdf29e57a5889d8c4f0b3f3a98920f152a125a9"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.898713 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" event={"ID":"e3917efc-ed36-4d2a-9af3-690506b9e302","Type":"ContainerStarted","Data":"8c2a1561c89bc859113b17f94ffa81d3046d96c89dd169fecb9254de8f609388"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.898745 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" event={"ID":"e3917efc-ed36-4d2a-9af3-690506b9e302","Type":"ContainerStarted","Data":"676f2ce33abf591674412da04e839da2a83d7781efec245e0ee1e3462c63a9e6"} Nov 28 13:20:38 crc kubenswrapper[4747]: E1128 13:20:38.906734 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.406716722 +0000 UTC m=+92.069198532 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.911996 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" event={"ID":"22026c31-b3d5-4041-8835-4a0f97f456e6","Type":"ContainerStarted","Data":"9e8da239df148ae8b108b7caee3f0a13d464e2d633b32e1db9a889c2c05ab5c0"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.915544 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" event={"ID":"e6b781bc-66b7-427e-8175-bd578310559e","Type":"ContainerStarted","Data":"d43ecdfb8d662b4cdb87494d01d6af16543aa7af86d8143694936b16e497706a"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.916419 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" event={"ID":"5d776caa-4299-4b4c-b60c-2bbbe67b02a0","Type":"ContainerStarted","Data":"c1ccab4944941fd56f88c88af011fbbc256c916f20e7c8754d29af3a5df10e70"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.918010 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" event={"ID":"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f","Type":"ContainerStarted","Data":"a698a1c1fafd60a1fa7f693de98e5a04457a8d5aa54733cdc08f68c082c158e2"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.918041 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" 
event={"ID":"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f","Type":"ContainerStarted","Data":"6ad209ca8f97a2f5d0ace27c3a5d57637772124dc95d694515e37aba73461f62"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.918664 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.921428 4747 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-926zp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.921471 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" podUID="a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.921936 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" event={"ID":"3e9185d1-abcc-43fc-a1af-42834e838dae","Type":"ContainerStarted","Data":"c42d3e5a51f9136f2e0737b5ffe1dcce4448d29649b68145fe2a53dfe4170a89"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.924371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" event={"ID":"b16cb6fb-6e9e-440b-96df-50841a2e14d3","Type":"ContainerStarted","Data":"e3b05403761aa6dce8112b52cb59d63d768fd95b76ff4e7bd9bbe5477115a85d"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.926188 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-sf6rs" event={"ID":"40766993-1700-48b0-97ab-775d3076167f","Type":"ContainerStarted","Data":"f1414d48b290f0d1ad60b7940e8ce646468a4a97532a1802c844d5facaa4b344"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.945449 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" event={"ID":"4bd443a2-dd2a-4278-854c-0d7c403b2603","Type":"ContainerStarted","Data":"4621b298af9bd53254533627a03fefa841a28b8c682b90879fc00dcbb63f3079"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.952458 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" event={"ID":"72e9d3f8-217e-4a29-a43b-e2b17998b6e1","Type":"ContainerStarted","Data":"0eadcd4f1e7bb70a61363625cc9bb0278161405b8c5c38a7cbd87a3bc94d9dd8"} Nov 28 13:20:38 crc kubenswrapper[4747]: I1128 13:20:38.952488 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" event={"ID":"72e9d3f8-217e-4a29-a43b-e2b17998b6e1","Type":"ContainerStarted","Data":"5057afb2107c97858b33f8cb43194568fa744f230726580d08e985aac49ea3c6"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:38.999942 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.001321 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:39.501291396 +0000 UTC m=+92.163773126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.054765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z6ppk" event={"ID":"1ff99da0-58a5-4e05-8e55-88f24bb0a962","Type":"ContainerStarted","Data":"5b9c6405a75d1c2729ad16bcfe071dd0087f1494933a8300543f44f5630ca442"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.103333 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.105623 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.605610644 +0000 UTC m=+92.268092374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.106985 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" event={"ID":"7204cb20-aa3b-4484-bc0e-64155cb7f734","Type":"ContainerStarted","Data":"d9a643f1d9351f520668c8970def8fb31ce82fa951936f2a040757ae94797112"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.107038 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" event={"ID":"7204cb20-aa3b-4484-bc0e-64155cb7f734","Type":"ContainerStarted","Data":"167d99cec5afcd74a7a5cc40e9bc33532ba710fc853867ff5411ef5ca801486d"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.107079 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.167403 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:39 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Nov 28 13:20:39 crc kubenswrapper[4747]: [+]process-running ok Nov 28 13:20:39 crc kubenswrapper[4747]: healthz check failed Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.167463 4747 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.212816 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.214347 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.714326362 +0000 UTC m=+92.376808092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.242414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" event={"ID":"871d0c50-a5cc-4de2-9566-0dc749cb24f2","Type":"ContainerStarted","Data":"162f53b1de306b0796e54fd153ae791ac9a87747a65ba9e8b5913615c74291d1"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.242461 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" event={"ID":"871d0c50-a5cc-4de2-9566-0dc749cb24f2","Type":"ContainerStarted","Data":"d0766b0b2472a1e9d034ba12cba1ee16790fea638ff99cb97626dfa96dcf3a0c"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.292111 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" event={"ID":"fe3d7e88-f7a4-428b-9415-1dd1b4047015","Type":"ContainerStarted","Data":"d00a3615247ee0d7ff7b828d82521ad33511c7a6a2a104692b0c9df0ce93d57b"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.317185 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" event={"ID":"df275db8-c500-4448-8073-97e037ad189f","Type":"ContainerStarted","Data":"e00e968c512ee856aa61c3796e487e6049282c4e812f547c2f0e96653dd1d5b3"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.323680 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" Nov 28 
13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.323751 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" event={"ID":"df275db8-c500-4448-8073-97e037ad189f","Type":"ContainerStarted","Data":"04c4b5761a74a77c0997b0c20af993e39663e5f347d8b02ffa9ae53d3df8d756"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.322739 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.324968 4747 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6292s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.325547 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.825493686 +0000 UTC m=+92.487975416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.325586 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" podUID="df275db8-c500-4448-8073-97e037ad189f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.350711 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cgmxb" event={"ID":"533808f8-dc48-431c-bfe9-6019090f4832","Type":"ContainerStarted","Data":"d8a8893db434fdd72d1e7979206432b03c6a8f8cf9895aae9039566f53671f78"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.397690 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" event={"ID":"d89e4b8f-9b72-467e-bc54-c4e6421717ac","Type":"ContainerStarted","Data":"65ef96eace41e35acc874546075a8eb4cb946e210efea8db6ca381d15c691633"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.397780 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" event={"ID":"d89e4b8f-9b72-467e-bc54-c4e6421717ac","Type":"ContainerStarted","Data":"ce3b434a232534819c998489ef7c6be067cc0b57c4fb998e0f3e30972c1b9922"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.399119 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.407608 4747 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-67648 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.408128 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" podUID="d89e4b8f-9b72-467e-bc54-c4e6421717ac" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.410928 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7rvs2" podStartSLOduration=73.410891201 podStartE2EDuration="1m13.410891201s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.398688731 +0000 UTC m=+92.061170461" watchObservedRunningTime="2025-11-28 13:20:39.410891201 +0000 UTC m=+92.073372931" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.417286 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fsbcb" event={"ID":"d1eb69af-0de8-4778-af20-7a12273e384d","Type":"ContainerStarted","Data":"6bfbf7e743d29c804803928491cc294c2c6e9a61c4b81ee95882589b67e35928"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.433969 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.434361 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:39.934327407 +0000 UTC m=+92.596809137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.446151 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nq9nc" podStartSLOduration=73.446134227 podStartE2EDuration="1m13.446134227s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.444106482 +0000 UTC m=+92.106588212" watchObservedRunningTime="2025-11-28 13:20:39.446134227 +0000 UTC m=+92.108615957" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.453735 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" 
event={"ID":"8a7ae477-ca56-4362-a334-a2915d71fdf0","Type":"ContainerStarted","Data":"5326b03841297443567e063105e877e9dd81854e8cb22f1a7f4dc280e7cfcd0f"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.505826 4747 generic.go:334] "Generic (PLEG): container finished" podID="b921d77a-e870-46f5-afe8-e071962b3881" containerID="7a81240bc894735f3aab1ddbbfe85772d71c22d89570099dae81d2c87835c9ad" exitCode=0 Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.505915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" event={"ID":"b921d77a-e870-46f5-afe8-e071962b3881","Type":"ContainerDied","Data":"7a81240bc894735f3aab1ddbbfe85772d71c22d89570099dae81d2c87835c9ad"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.517319 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" event={"ID":"91f9d75e-6ca4-433a-8acd-ad2f23490d9a","Type":"ContainerStarted","Data":"d53375e11699077812a7a370480c68d238c0c2c6b85f409ade9b9f6fa6d860a2"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.527598 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nvrln" event={"ID":"de3d9a69-800a-4146-9ed2-b74e44da14ed","Type":"ContainerStarted","Data":"8d8a9beca130f498242f75a36c2586bcd60e520f2fa984d24ffa53cc3aeb9725"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.527656 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nvrln" event={"ID":"de3d9a69-800a-4146-9ed2-b74e44da14ed","Type":"ContainerStarted","Data":"b8048a9650696edca73a644757a1d2ab3d64c9f235eda24fd74e26dcb82adcd9"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.535778 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.538133 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.038117631 +0000 UTC m=+92.700599361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.543893 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" podStartSLOduration=73.543873727 podStartE2EDuration="1m13.543873727s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.506853473 +0000 UTC m=+92.169335203" watchObservedRunningTime="2025-11-28 13:20:39.543873727 +0000 UTC m=+92.206355477" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.555433 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" podStartSLOduration=73.55540982 podStartE2EDuration="1m13.55540982s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.542442368 +0000 UTC m=+92.204924118" watchObservedRunningTime="2025-11-28 13:20:39.55540982 +0000 UTC m=+92.217891550" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.555631 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" event={"ID":"750cc1d2-faa4-46ca-86a1-57720f4d922a","Type":"ContainerStarted","Data":"0b586735978155288273340f6aeaf133bfd87ed76649b8804993cb61d3af9ff7"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.559654 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" event={"ID":"5bf48580-f8cb-46a8-8d47-f1317a13eca4","Type":"ContainerStarted","Data":"467e5a1e99af94e0dec67e7dbc3b02dfc4b059f4e9d3d67f04be5b97de4b61a3"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.588674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" event={"ID":"a9581480-5267-4031-b97c-cf5e5546448e","Type":"ContainerStarted","Data":"b8128b470407cbf9547789b42b2b626251b79c0e8a48f41eb5afec28b02c333b"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.597652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24" event={"ID":"1a702977-9ffe-4c8a-b5d8-c49bff5b7030","Type":"ContainerStarted","Data":"4e3380cba12e14a694f3cc82ba140cecdb4d343d6b71880ea45816387a7ca7c9"} Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.617110 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pl7n" podStartSLOduration=73.617089412 podStartE2EDuration="1m13.617089412s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.575533825 +0000 UTC m=+92.238015555" watchObservedRunningTime="2025-11-28 13:20:39.617089412 +0000 UTC m=+92.279571142" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.619550 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qk8hm" podStartSLOduration=73.619537228 podStartE2EDuration="1m13.619537228s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.612740134 +0000 UTC m=+92.275221854" watchObservedRunningTime="2025-11-28 13:20:39.619537228 +0000 UTC m=+92.282018978" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.636957 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.638175 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.138159363 +0000 UTC m=+92.800641093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.647151 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s" podStartSLOduration=73.647131347 podStartE2EDuration="1m13.647131347s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.645708958 +0000 UTC m=+92.308190688" watchObservedRunningTime="2025-11-28 13:20:39.647131347 +0000 UTC m=+92.309613077" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.692219 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-stzdc" podStartSLOduration=73.692191368 podStartE2EDuration="1m13.692191368s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.689949207 +0000 UTC m=+92.352430937" watchObservedRunningTime="2025-11-28 13:20:39.692191368 +0000 UTC m=+92.354673088" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.725008 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" podStartSLOduration=73.724991448 podStartE2EDuration="1m13.724991448s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.72359495 +0000 UTC m=+92.386076680" watchObservedRunningTime="2025-11-28 13:20:39.724991448 +0000 UTC m=+92.387473178" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.742000 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.743633 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.243610452 +0000 UTC m=+92.906092282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.770789 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44" podStartSLOduration=73.770766869 podStartE2EDuration="1m13.770766869s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.769553876 +0000 UTC m=+92.432035606" watchObservedRunningTime="2025-11-28 13:20:39.770766869 +0000 UTC m=+92.433248599" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.806947 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" podStartSLOduration=73.806931139 podStartE2EDuration="1m13.806931139s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.804359429 +0000 UTC m=+92.466841159" watchObservedRunningTime="2025-11-28 13:20:39.806931139 +0000 UTC m=+92.469412869" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.833335 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hlq5" podStartSLOduration=73.833317415 podStartE2EDuration="1m13.833317415s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.832066471 +0000 UTC m=+92.494548201" watchObservedRunningTime="2025-11-28 13:20:39.833317415 +0000 UTC m=+92.495799145" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.845003 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.845106 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.345090744 +0000 UTC m=+93.007572474 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.845356 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.845745 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.345737291 +0000 UTC m=+93.008219021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.870443 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" podStartSLOduration=73.870424811 podStartE2EDuration="1m13.870424811s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.869264409 +0000 UTC m=+92.531746139" watchObservedRunningTime="2025-11-28 13:20:39.870424811 +0000 UTC m=+92.532906541" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.902123 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nvrln" podStartSLOduration=6.9021056 podStartE2EDuration="6.9021056s" podCreationTimestamp="2025-11-28 13:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.899950581 +0000 UTC m=+92.562432301" watchObservedRunningTime="2025-11-28 13:20:39.9021056 +0000 UTC m=+92.564587330" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.912951 4747 patch_prober.go:28] interesting pod/console-operator-58897d9998-nq9nc container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.913018 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nq9nc" podUID="4a180bf3-5719-4d9b-8ebb-beb0315e7cac" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.960727 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:39 crc kubenswrapper[4747]: E1128 13:20:39.961275 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.461253943 +0000 UTC m=+93.123735673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:39 crc kubenswrapper[4747]: I1128 13:20:39.961861 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" podStartSLOduration=73.961829829 podStartE2EDuration="1m13.961829829s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:39.953634207 +0000 UTC m=+92.616115937" watchObservedRunningTime="2025-11-28 13:20:39.961829829 +0000 UTC m=+92.624311559" Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.062137 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.062500 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.562487708 +0000 UTC m=+93.224969438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.080491 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" podStartSLOduration=74.080471566 podStartE2EDuration="1m14.080471566s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:40.077616948 +0000 UTC m=+92.740098678" watchObservedRunningTime="2025-11-28 13:20:40.080471566 +0000 UTC m=+92.742953286" Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.165782 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.166019 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.665961284 +0000 UTC m=+93.328443014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.166374 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.166673 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.666658293 +0000 UTC m=+93.329140023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.169300 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:40 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Nov 28 13:20:40 crc kubenswrapper[4747]: [+]process-running ok Nov 28 13:20:40 crc kubenswrapper[4747]: healthz check failed Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.169347 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.267257 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.267731 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:40.767716173 +0000 UTC m=+93.430197903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.370302 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.370726 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.870713986 +0000 UTC m=+93.533195716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.470927 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.471030 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.971012285 +0000 UTC m=+93.633494015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.471439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.471740 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:40.971731244 +0000 UTC m=+93.634212974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.572543 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.572887 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.072870557 +0000 UTC m=+93.735352287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.630549 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-66kvx" event={"ID":"a9581480-5267-4031-b97c-cf5e5546448e","Type":"ContainerStarted","Data":"9025ee53591f1cdf758ab7ef8990f233380bd15f0abdf050ed0b9c4977800495"} Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.637431 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" event={"ID":"fe3d7e88-f7a4-428b-9415-1dd1b4047015","Type":"ContainerStarted","Data":"c99edb7ca6d32204ee4b80e0c2bfad41d7b053f45727fe2e2997390f93e1d17e"} Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.649935 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-c6vdt" event={"ID":"5a895309-39e8-42bb-8df7-821c0c600504","Type":"ContainerStarted","Data":"579f717c203f648c821530ef046b21e4b3c3f37268b0bf51726bd3f89b0804ff"} Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.650162 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-c6vdt" Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.659069 4747 generic.go:334] "Generic (PLEG): container finished" podID="3e9185d1-abcc-43fc-a1af-42834e838dae" containerID="02095f6206cbd8cc649ee0d51b4c7f6ffc31e2ecf57516f2775401c5bb1b7ab1" exitCode=0 Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.659225 4747 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-c6vdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.659218 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" event={"ID":"3e9185d1-abcc-43fc-a1af-42834e838dae","Type":"ContainerDied","Data":"02095f6206cbd8cc649ee0d51b4c7f6ffc31e2ecf57516f2775401c5bb1b7ab1"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.659268 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-c6vdt" podUID="5a895309-39e8-42bb-8df7-821c0c600504" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.670126 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7b75" podStartSLOduration=74.670111013 podStartE2EDuration="1m14.670111013s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:40.668474889 +0000 UTC m=+93.330956619" watchObservedRunningTime="2025-11-28 13:20:40.670111013 +0000 UTC m=+93.332592743"
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.674165 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.675284 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.175271723 +0000 UTC m=+93.837753453 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.708571 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" event={"ID":"e3917efc-ed36-4d2a-9af3-690506b9e302","Type":"ContainerStarted","Data":"09baf15c7ed536e927ebffe5cd423cf98f519775ac5c21b4a2a0249f74aec379"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.712612 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-c6vdt" podStartSLOduration=74.712581825 podStartE2EDuration="1m14.712581825s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:40.710885719 +0000 UTC m=+93.373367449" watchObservedRunningTime="2025-11-28 13:20:40.712581825 +0000 UTC m=+93.375063555"
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.742773 4747 generic.go:334] "Generic (PLEG): container finished" podID="e6b781bc-66b7-427e-8175-bd578310559e" containerID="7ca2310bba78f0f4ab75e20715581deafe2e3683de215e5033f5e9d09ee14a5b" exitCode=0
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.742834 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" event={"ID":"e6b781bc-66b7-427e-8175-bd578310559e","Type":"ContainerDied","Data":"7ca2310bba78f0f4ab75e20715581deafe2e3683de215e5033f5e9d09ee14a5b"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.759020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" event={"ID":"e72da77b-12ea-4b7a-bc1c-6d157c393bc0","Type":"ContainerStarted","Data":"dd9660170f165bd9825593ea94f268af2adcce288a9105f2eba37c4dd7b181c5"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.759893 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh"
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.771353 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" event={"ID":"22026c31-b3d5-4041-8835-4a0f97f456e6","Type":"ContainerStarted","Data":"bf019007bd18e52936b5565a4d556847282d91d461dbdbf2ff7cb4546740dd56"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.772535 4747 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lfdbh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body=
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.772573 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" podUID="e72da77b-12ea-4b7a-bc1c-6d157c393bc0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused"
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.781832 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.783419 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.283397715 +0000 UTC m=+93.945879445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.816988 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" event={"ID":"5d776caa-4299-4b4c-b60c-2bbbe67b02a0","Type":"ContainerStarted","Data":"ea8d4f388016bc13fce276fed1a3b1690727ef95047c04077632b90e4f158bcf"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.817034 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" event={"ID":"5d776caa-4299-4b4c-b60c-2bbbe67b02a0","Type":"ContainerStarted","Data":"c2bcf75ecd4a35d7533758ce91af462419f1e1a3cf8eeee28fe03c3505d19546"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.847479 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cgmxb" event={"ID":"533808f8-dc48-431c-bfe9-6019090f4832","Type":"ContainerStarted","Data":"4ec9d40fea562db66c6517f1844e5774a0f6f8116b8b90c9292c062960b65dd6"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.848376 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-cgmxb"
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.882479 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24" event={"ID":"1a702977-9ffe-4c8a-b5d8-c49bff5b7030","Type":"ContainerStarted","Data":"9ad24bc2878bc534a4442e3799ae3cc50ebdb7f56a7ff6268b47a4d8b633bc7e"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.882523 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24" event={"ID":"1a702977-9ffe-4c8a-b5d8-c49bff5b7030","Type":"ContainerStarted","Data":"7a58ce0f3ca5d638d6067dc2a90445f59507703e8d9e2c7d0dddda80d1a15131"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.883285 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.885592 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.385580065 +0000 UTC m=+94.048061795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.903232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4fsxt" event={"ID":"750cc1d2-faa4-46ca-86a1-57720f4d922a","Type":"ContainerStarted","Data":"78fe92281543678ffc6b62aa1dd58800c12f82b9a7bdb5f17154c92b4e66a69b"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.942931 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" event={"ID":"91f9d75e-6ca4-433a-8acd-ad2f23490d9a","Type":"ContainerStarted","Data":"3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9"}
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.944187 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.991400 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.991606 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.491588039 +0000 UTC m=+94.154069769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:40 crc kubenswrapper[4747]: I1128 13:20:40.992068 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:40 crc kubenswrapper[4747]: E1128 13:20:40.992969 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.492959316 +0000 UTC m=+94.155441046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.009289 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" event={"ID":"b921d77a-e870-46f5-afe8-e071962b3881","Type":"ContainerStarted","Data":"e745bb1f0e008c6f1604ee74e6cde7bdc5e46cdf8d18b7fb3faab450224c0e76"}
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.036399 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5cpjw" event={"ID":"5bf48580-f8cb-46a8-8d47-f1317a13eca4","Type":"ContainerStarted","Data":"72f53aeacf577eea0d9d5076a884f5355823ebaa086bdbdf543d51df584c38c7"}
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.074287 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" event={"ID":"91bb382c-c2d4-4118-aee5-1df968d79d25","Type":"ContainerStarted","Data":"4c24d18c6c449d586b015cb8f33114770c2b79cd659fcd48d4dec2926fa58658"}
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.092915 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.093836 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.59381306 +0000 UTC m=+94.256294780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.101132 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" event={"ID":"4bd443a2-dd2a-4278-854c-0d7c403b2603","Type":"ContainerStarted","Data":"fc30a6e172f228af877859cbd376536f76b797d79870f3abc34108da8be8fd6d"}
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.101189 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" event={"ID":"4bd443a2-dd2a-4278-854c-0d7c403b2603","Type":"ContainerStarted","Data":"bf3ae4d40957739b575ab56caf2a90819e99887b41875e40914f390df724fcb5"}
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.115405 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" event={"ID":"46d3747c-fa48-416e-84b8-c0d7ad4394f2","Type":"ContainerStarted","Data":"e29f4df44990eab6f5868f17af91d1f352fe1c880f91d9fa573a2024ca7890f1"}
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.115875 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.116910 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dr67x container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.116964 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.136265 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.137123 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.159766 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.160399 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6292s"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.166451 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 28 13:20:41 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Nov 28 13:20:41 crc kubenswrapper[4747]: [+]process-running ok
Nov 28 13:20:41 crc kubenswrapper[4747]: healthz check failed
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.166503 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.197564 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.210267 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.710248377 +0000 UTC m=+94.372730107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.224874 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nq9nc"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.255674 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rqm6f" podStartSLOduration=75.255659838 podStartE2EDuration="1m15.255659838s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.253113929 +0000 UTC m=+93.915595659" watchObservedRunningTime="2025-11-28 13:20:41.255659838 +0000 UTC m=+93.918141568"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.292001 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" podStartSLOduration=75.291985783 podStartE2EDuration="1m15.291985783s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.291585402 +0000 UTC m=+93.954067132" watchObservedRunningTime="2025-11-28 13:20:41.291985783 +0000 UTC m=+93.954467513"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.299525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.299906 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.799889417 +0000 UTC m=+94.462371147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.385162 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" podStartSLOduration=75.385147189 podStartE2EDuration="1m15.385147189s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.346671395 +0000 UTC m=+94.009153125" watchObservedRunningTime="2025-11-28 13:20:41.385147189 +0000 UTC m=+94.047628919"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.386442 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gxp24" podStartSLOduration=75.386435703 podStartE2EDuration="1m15.386435703s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.384628884 +0000 UTC m=+94.047110614" watchObservedRunningTime="2025-11-28 13:20:41.386435703 +0000 UTC m=+94.048917433"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.402921 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.403306 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:41.903294211 +0000 UTC m=+94.565775941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.463635 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dsc95" podStartSLOduration=75.463616446 podStartE2EDuration="1m15.463616446s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.462501356 +0000 UTC m=+94.124983086" watchObservedRunningTime="2025-11-28 13:20:41.463616446 +0000 UTC m=+94.126098176"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.494342 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ngfzt" podStartSLOduration=75.494323079 podStartE2EDuration="1m15.494323079s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.494145394 +0000 UTC m=+94.156627124" watchObservedRunningTime="2025-11-28 13:20:41.494323079 +0000 UTC m=+94.156804809"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.508512 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.508872 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.008857403 +0000 UTC m=+94.671339133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.589058 4747 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.610289 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.610570 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.11055917 +0000 UTC m=+94.773040900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.668361 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cgmxb" podStartSLOduration=8.668341357 podStartE2EDuration="8.668341357s" podCreationTimestamp="2025-11-28 13:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.627592242 +0000 UTC m=+94.290073972" watchObservedRunningTime="2025-11-28 13:20:41.668341357 +0000 UTC m=+94.330823087"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.712690 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.712789 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.212772961 +0000 UTC m=+94.875254691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.713108 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.713390 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.213383808 +0000 UTC m=+94.875865538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.740181 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ksgtz" podStartSLOduration=75.740164164 podStartE2EDuration="1m15.740164164s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.739095785 +0000 UTC m=+94.401577515" watchObservedRunningTime="2025-11-28 13:20:41.740164164 +0000 UTC m=+94.402645894"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.740725 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" podStartSLOduration=75.740719079 podStartE2EDuration="1m15.740719079s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.713776509 +0000 UTC m=+94.376258229" watchObservedRunningTime="2025-11-28 13:20:41.740719079 +0000 UTC m=+94.403200809"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.766289 4747 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-28T13:20:41.589088728Z","Handler":null,"Name":""}
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.818288 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.818481 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.318454597 +0000 UTC m=+94.980936327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.818611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:20:41 crc kubenswrapper[4747]: E1128 13:20:41.818942 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.31893429 +0000 UTC m=+94.981416020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jms9b" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.847028 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.848423 4747 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.848455 4747 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.880699 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nxfwm" podStartSLOduration=75.880680054 podStartE2EDuration="1m15.880680054s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:41.788221667 +0000 UTC m=+94.450703397" watchObservedRunningTime="2025-11-28 13:20:41.880680054 +0000 UTC m=+94.543161784"
Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.920130 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:41 crc kubenswrapper[4747]: I1128 13:20:41.931154 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.022717 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.027910 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.028079 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.121023 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" event={"ID":"b921d77a-e870-46f5-afe8-e071962b3881","Type":"ContainerStarted","Data":"c7f62327a13d754daa90e024c8e68a8681ccf5d7263db4c8117e3b55b9afd9fd"} Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.123327 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" event={"ID":"871d0c50-a5cc-4de2-9566-0dc749cb24f2","Type":"ContainerStarted","Data":"ffdbede32dde6a1649e2e4661c76c38d651b6b0e8696c30ff37ec50f41212d99"} Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.123378 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" event={"ID":"871d0c50-a5cc-4de2-9566-0dc749cb24f2","Type":"ContainerStarted","Data":"40e9eed69df60d6226ade7281d95b3d067a00c12fae53725fe206b04cb70531a"} Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.124953 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" event={"ID":"3e9185d1-abcc-43fc-a1af-42834e838dae","Type":"ContainerStarted","Data":"6c106ad16b06daf06532d044bf614d8cdc81d8c828e11cc29adb4d20d76e90dd"} Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.125122 
4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.126486 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cgmxb" event={"ID":"533808f8-dc48-431c-bfe9-6019090f4832","Type":"ContainerStarted","Data":"388d7769335608df90e92112a517ded9d9365c26af00ed3060c65cc0c7f5d4f4"} Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.128027 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lzcth" event={"ID":"22026c31-b3d5-4041-8835-4a0f97f456e6","Type":"ContainerStarted","Data":"65533e90634704952adebfe256e3f0701a028f6162a049d6432311b934d5ddaf"} Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.129758 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" event={"ID":"e6b781bc-66b7-427e-8175-bd578310559e","Type":"ContainerStarted","Data":"83cbc12e7204357f68ededfe6f0fb83dc0c1070559725d0a6d3df75fefb5e73b"} Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.130687 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-c6vdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.130734 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-c6vdt" podUID="5a895309-39e8-42bb-8df7-821c0c600504" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.130888 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dr67x container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.131025 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.163640 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:42 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Nov 28 13:20:42 crc kubenswrapper[4747]: [+]process-running ok Nov 28 13:20:42 crc kubenswrapper[4747]: healthz check failed Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.163711 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.168813 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x" podStartSLOduration=76.168782375 podStartE2EDuration="1m16.168782375s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:42.165728732 +0000 UTC m=+94.828210462" watchObservedRunningTime="2025-11-28 13:20:42.168782375 +0000 UTC m=+94.831264115" Nov 28 
13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.174340 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lfdbh" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.194975 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6d995"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.196712 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.201979 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj" podStartSLOduration=76.201956285 podStartE2EDuration="1m16.201956285s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:42.19774103 +0000 UTC m=+94.860222760" watchObservedRunningTime="2025-11-28 13:20:42.201956285 +0000 UTC m=+94.864438015" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.203447 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.220317 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6d995"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.225944 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-catalog-content\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 
13:20:42.226223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc8x6\" (UniqueName: \"kubernetes.io/projected/e05164c9-17fa-41ef-9abe-b00460c2cb96-kube-api-access-kc8x6\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.226298 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-utilities\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.251677 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf" podStartSLOduration=76.251660632 podStartE2EDuration="1m16.251660632s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:42.251058896 +0000 UTC m=+94.913540636" watchObservedRunningTime="2025-11-28 13:20:42.251660632 +0000 UTC m=+94.914142362" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.274020 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jms9b\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.327889 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-catalog-content\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.327963 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc8x6\" (UniqueName: \"kubernetes.io/projected/e05164c9-17fa-41ef-9abe-b00460c2cb96-kube-api-access-kc8x6\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.327989 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-utilities\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.328439 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-utilities\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.328650 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-catalog-content\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.353921 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc8x6\" (UniqueName: 
\"kubernetes.io/projected/e05164c9-17fa-41ef-9abe-b00460c2cb96-kube-api-access-kc8x6\") pod \"community-operators-6d995\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.377224 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7khwx"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.378164 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.391775 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.399957 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7khwx"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.429391 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-utilities\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.429441 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-catalog-content\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.429477 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72884\" (UniqueName: 
\"kubernetes.io/projected/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-kube-api-access-72884\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.530721 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-utilities\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.530797 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-catalog-content\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.530863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72884\" (UniqueName: \"kubernetes.io/projected/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-kube-api-access-72884\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.531381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-utilities\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.531520 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-catalog-content\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.551506 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6d995" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.573219 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.573677 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72884\" (UniqueName: \"kubernetes.io/projected/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-kube-api-access-72884\") pod \"certified-operators-7khwx\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.584744 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4g45d"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.585855 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.614313 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4g45d"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.632776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxhtd\" (UniqueName: \"kubernetes.io/projected/e62c66db-e2f6-49e2-a663-d8f31f895c1c-kube-api-access-rxhtd\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.632846 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-catalog-content\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.632957 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-utilities\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.694978 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.734192 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-utilities\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.734340 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-catalog-content\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.734359 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxhtd\" (UniqueName: \"kubernetes.io/projected/e62c66db-e2f6-49e2-a663-d8f31f895c1c-kube-api-access-rxhtd\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.734797 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-utilities\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.734812 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-catalog-content\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " 
pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.771041 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxhtd\" (UniqueName: \"kubernetes.io/projected/e62c66db-e2f6-49e2-a663-d8f31f895c1c-kube-api-access-rxhtd\") pod \"community-operators-4g45d\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") " pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.795679 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jz5h9"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.806521 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.835191 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kblrx\" (UniqueName: \"kubernetes.io/projected/f857c545-4773-4fc8-87b3-37367dd71e20-kube-api-access-kblrx\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.835271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-catalog-content\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.835381 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-utilities\") pod \"certified-operators-jz5h9\" (UID: 
\"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.859411 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jz5h9"] Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.907044 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.936896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-utilities\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.937324 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kblrx\" (UniqueName: \"kubernetes.io/projected/f857c545-4773-4fc8-87b3-37367dd71e20-kube-api-access-kblrx\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.937387 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-catalog-content\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.937883 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-catalog-content\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " 
pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.938121 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-utilities\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:42 crc kubenswrapper[4747]: I1128 13:20:42.995031 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kblrx\" (UniqueName: \"kubernetes.io/projected/f857c545-4773-4fc8-87b3-37367dd71e20-kube-api-access-kblrx\") pod \"certified-operators-jz5h9\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") " pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.066223 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jms9b"] Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.144445 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.164108 4747 generic.go:334] "Generic (PLEG): container finished" podID="7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" containerID="9c824ea5fbc84eaf1b3a5b1c2d9780d4645a486006b9788fc1e98ebbece5a8c0" exitCode=0 Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.164182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" event={"ID":"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76","Type":"ContainerDied","Data":"9c824ea5fbc84eaf1b3a5b1c2d9780d4645a486006b9788fc1e98ebbece5a8c0"} Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.176391 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:43 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Nov 28 13:20:43 crc kubenswrapper[4747]: [+]process-running ok Nov 28 13:20:43 crc kubenswrapper[4747]: healthz check failed Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.176443 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.192556 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6d995"] Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.208983 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" 
event={"ID":"871d0c50-a5cc-4de2-9566-0dc749cb24f2","Type":"ContainerStarted","Data":"7515dcf8313bade2ab4f7a03057bc0f9e01e43ac8f3a17c8b201222668389c4b"} Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.248963 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7khwx"] Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.265905 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wqrlt" podStartSLOduration=10.265889821 podStartE2EDuration="10.265889821s" podCreationTimestamp="2025-11-28 13:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:43.263786674 +0000 UTC m=+95.926268404" watchObservedRunningTime="2025-11-28 13:20:43.265889821 +0000 UTC m=+95.928371541" Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.450511 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4g45d"] Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.625085 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jz5h9"] Nov 28 13:20:43 crc kubenswrapper[4747]: I1128 13:20:43.651402 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 28 13:20:43 crc kubenswrapper[4747]: W1128 13:20:43.672004 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf857c545_4773_4fc8_87b3_37367dd71e20.slice/crio-fc3342d47658b56131e64c646c7c20b0e2102b04fadcc21187691001d1ae0ce1 WatchSource:0}: Error finding container fc3342d47658b56131e64c646c7c20b0e2102b04fadcc21187691001d1ae0ce1: Status 404 returned error can't find the container with id 
fc3342d47658b56131e64c646c7c20b0e2102b04fadcc21187691001d1ae0ce1 Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.157261 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:44 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Nov 28 13:20:44 crc kubenswrapper[4747]: [+]process-running ok Nov 28 13:20:44 crc kubenswrapper[4747]: healthz check failed Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.157594 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.157603 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-86pgk"] Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.158606 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.160353 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.172299 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-86pgk"] Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.213659 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.214951 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.217245 4747 generic.go:334] "Generic (PLEG): container finished" podID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerID="a4c56382690751d093e4d64086f0fe726471d7c917aa6adb347dd27a1c046368" exitCode=0 Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.217403 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d995" event={"ID":"e05164c9-17fa-41ef-9abe-b00460c2cb96","Type":"ContainerDied","Data":"a4c56382690751d093e4d64086f0fe726471d7c917aa6adb347dd27a1c046368"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.217437 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d995" event={"ID":"e05164c9-17fa-41ef-9abe-b00460c2cb96","Type":"ContainerStarted","Data":"2c63f327222a11bd788fe082f336bf1722d28cac1bf3787a4f43e9dcc1e8ec0b"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.218463 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.219758 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.220647 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.221238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" event={"ID":"a0c345d3-2efb-458e-9b68-52c46be2279c","Type":"ContainerStarted","Data":"d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.221272 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" event={"ID":"a0c345d3-2efb-458e-9b68-52c46be2279c","Type":"ContainerStarted","Data":"1b5b92b320762faf392e66825bdf2e97f6cec91b78060a72937f964de80b5dfb"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.221935 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.222144 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.226599 4747 generic.go:334] "Generic (PLEG): container finished" podID="f857c545-4773-4fc8-87b3-37367dd71e20" containerID="529d181420112a5a2a446c73d5115f376f0ee04fcdf8aa53dde0e76fa27b9026" exitCode=0 Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.226675 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz5h9" event={"ID":"f857c545-4773-4fc8-87b3-37367dd71e20","Type":"ContainerDied","Data":"529d181420112a5a2a446c73d5115f376f0ee04fcdf8aa53dde0e76fa27b9026"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.226697 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz5h9" event={"ID":"f857c545-4773-4fc8-87b3-37367dd71e20","Type":"ContainerStarted","Data":"fc3342d47658b56131e64c646c7c20b0e2102b04fadcc21187691001d1ae0ce1"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.230133 4747 generic.go:334] "Generic (PLEG): container finished" podID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerID="70e548b1b73260c2bd83c93a869b83248270cf109bfe490da1258775c7555646" exitCode=0 Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.230224 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4g45d" 
event={"ID":"e62c66db-e2f6-49e2-a663-d8f31f895c1c","Type":"ContainerDied","Data":"70e548b1b73260c2bd83c93a869b83248270cf109bfe490da1258775c7555646"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.230259 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4g45d" event={"ID":"e62c66db-e2f6-49e2-a663-d8f31f895c1c","Type":"ContainerStarted","Data":"b390773bb6d7add095367c7c81c9abb37fed7ce856d962b1238143b1a16f1473"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.232366 4747 generic.go:334] "Generic (PLEG): container finished" podID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerID="8c659bddf5231181c63ec8301a205fd500746c303500a0060a0375200b11f9c7" exitCode=0 Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.232414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7khwx" event={"ID":"cca84f8d-3b79-44e5-8de8-af6bc47e7bba","Type":"ContainerDied","Data":"8c659bddf5231181c63ec8301a205fd500746c303500a0060a0375200b11f9c7"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.232451 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7khwx" event={"ID":"cca84f8d-3b79-44e5-8de8-af6bc47e7bba","Type":"ContainerStarted","Data":"4d0de0425edc1cc1601032947ad347b993428692503d9653a82ced158f849ad2"} Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.263699 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-catalog-content\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.263752 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8gb\" (UniqueName: 
\"kubernetes.io/projected/0a8a42c3-62aa-4542-b86e-171e124c81f4-kube-api-access-5t8gb\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.263853 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.263937 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.263966 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-utilities\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.330873 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" podStartSLOduration=78.330852605 podStartE2EDuration="1m18.330852605s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:44.311268934 +0000 UTC m=+96.973750684" watchObservedRunningTime="2025-11-28 
13:20:44.330852605 +0000 UTC m=+96.993334335" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.364911 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-catalog-content\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.364954 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8gb\" (UniqueName: \"kubernetes.io/projected/0a8a42c3-62aa-4542-b86e-171e124c81f4-kube-api-access-5t8gb\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.365015 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.365069 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.365095 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-utilities\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " 
pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.365575 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-utilities\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.366040 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-catalog-content\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.366479 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.388658 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.405772 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8gb\" (UniqueName: \"kubernetes.io/projected/0a8a42c3-62aa-4542-b86e-171e124c81f4-kube-api-access-5t8gb\") pod \"redhat-marketplace-86pgk\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc 
kubenswrapper[4747]: I1128 13:20:44.478877 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.517047 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.544076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.562954 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q9w47"] Nov 28 13:20:44 crc kubenswrapper[4747]: E1128 13:20:44.563581 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" containerName="collect-profiles" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.563608 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" containerName="collect-profiles" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.563793 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" containerName="collect-profiles" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.565008 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.566794 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8plq7\" (UniqueName: \"kubernetes.io/projected/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-kube-api-access-8plq7\") pod \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.566908 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-secret-volume\") pod \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.567005 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-config-volume\") pod \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\" (UID: \"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76\") " Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.567378 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.568788 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-config-volume" (OuterVolumeSpecName: "config-volume") pod "7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" (UID: "7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.572954 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd5d3d3-f6f3-48da-8e99-2e16fd81582f-metrics-certs\") pod \"network-metrics-daemon-jpqkc\" (UID: \"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f\") " pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.581446 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9w47"] Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.583556 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-kube-api-access-8plq7" (OuterVolumeSpecName: "kube-api-access-8plq7") pod "7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" (UID: "7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76"). InnerVolumeSpecName "kube-api-access-8plq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.583727 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" (UID: "7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.669239 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-utilities\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.669284 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-catalog-content\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.669326 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxrpl\" (UniqueName: \"kubernetes.io/projected/9319de37-d943-4f15-aec7-c2f0bf0ca64d-kube-api-access-dxrpl\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.669404 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.669419 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8plq7\" (UniqueName: \"kubernetes.io/projected/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-kube-api-access-8plq7\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.669434 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.702511 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-86pgk"] Nov 28 13:20:44 crc kubenswrapper[4747]: W1128 13:20:44.710121 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8a42c3_62aa_4542_b86e_171e124c81f4.slice/crio-cf396f90350b7da06e1c9b8e7dc71438059298e0191bf8084b410e092c656d07 WatchSource:0}: Error finding container cf396f90350b7da06e1c9b8e7dc71438059298e0191bf8084b410e092c656d07: Status 404 returned error can't find the container with id cf396f90350b7da06e1c9b8e7dc71438059298e0191bf8084b410e092c656d07 Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.763579 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.769764 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-utilities\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.769796 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-catalog-content\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.769825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxrpl\" (UniqueName: 
\"kubernetes.io/projected/9319de37-d943-4f15-aec7-c2f0bf0ca64d-kube-api-access-dxrpl\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.770290 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-utilities\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.774269 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-catalog-content\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.782430 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpqkc" Nov 28 13:20:44 crc kubenswrapper[4747]: W1128 13:20:44.784316 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod00c8bfec_bbc7_4954_89fc_c0d81c9c8a0c.slice/crio-08d1b72d0b7e013e5fb78c9b1236901027de1a361c8aac6ea4c9014e220a3a15 WatchSource:0}: Error finding container 08d1b72d0b7e013e5fb78c9b1236901027de1a361c8aac6ea4c9014e220a3a15: Status 404 returned error can't find the container with id 08d1b72d0b7e013e5fb78c9b1236901027de1a361c8aac6ea4c9014e220a3a15 Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.790163 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxrpl\" (UniqueName: \"kubernetes.io/projected/9319de37-d943-4f15-aec7-c2f0bf0ca64d-kube-api-access-dxrpl\") pod \"redhat-marketplace-q9w47\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:44 crc kubenswrapper[4747]: I1128 13:20:44.906255 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.013181 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpqkc"] Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.055841 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.057285 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.057391 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.063481 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.068442 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.083591 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d34970d2-14b1-445d-8fdc-41e36122c21e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.084612 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d34970d2-14b1-445d-8fdc-41e36122c21e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.167452 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9w47"] Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.167681 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:45 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Nov 28 13:20:45 crc kubenswrapper[4747]: [+]process-running ok Nov 28 13:20:45 crc kubenswrapper[4747]: healthz check failed Nov 28 
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.167709 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.193046 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d34970d2-14b1-445d-8fdc-41e36122c21e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.193145 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d34970d2-14b1-445d-8fdc-41e36122c21e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.193251 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d34970d2-14b1-445d-8fdc-41e36122c21e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.228288 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d34970d2-14b1-445d-8fdc-41e36122c21e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.246853 4747 generic.go:334] "Generic (PLEG): container finished" podID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerID="a5748d6607e57a464a6a5ab646258e85ea62948b4fd124361109b479df4dfba1" exitCode=0
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.247009 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86pgk" event={"ID":"0a8a42c3-62aa-4542-b86e-171e124c81f4","Type":"ContainerDied","Data":"a5748d6607e57a464a6a5ab646258e85ea62948b4fd124361109b479df4dfba1"}
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.247040 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86pgk" event={"ID":"0a8a42c3-62aa-4542-b86e-171e124c81f4","Type":"ContainerStarted","Data":"cf396f90350b7da06e1c9b8e7dc71438059298e0191bf8084b410e092c656d07"}
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.258361 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" event={"ID":"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f","Type":"ContainerStarted","Data":"8a0f4eea2aa467ac63c454b3aabf39265e112082dda11f24cd20000899f8c7dd"}
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.263018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb" event={"ID":"7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76","Type":"ContainerDied","Data":"c73e005fc8ea79899da802b4ea823c280664385e50f49a334bc1b34634d17523"}
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.263060 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c73e005fc8ea79899da802b4ea823c280664385e50f49a334bc1b34634d17523"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.263085 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.270534 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9w47" event={"ID":"9319de37-d943-4f15-aec7-c2f0bf0ca64d","Type":"ContainerStarted","Data":"f165400fa37a365910360d8af1bf1e9f3aa26f48c0a5e31bbbc351871dad878c"}
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.273291 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c","Type":"ContainerStarted","Data":"dd7b61d6bc7474e224eccf3828f06f9ddb89c50139f313d9c50a30da94e73c80"}
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.273351 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c","Type":"ContainerStarted","Data":"08d1b72d0b7e013e5fb78c9b1236901027de1a361c8aac6ea4c9014e220a3a15"}
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.294153 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.2941355620000001 podStartE2EDuration="1.294135562s" podCreationTimestamp="2025-11-28 13:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:45.293659749 +0000 UTC m=+97.956141499" watchObservedRunningTime="2025-11-28 13:20:45.294135562 +0000 UTC m=+97.956617292"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.383968 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.478987 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dbpj"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.553014 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvsv5"]
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.554079 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.556003 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.562736 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvsv5"]
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.618843 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-utilities\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.618908 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79bz9\" (UniqueName: \"kubernetes.io/projected/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-kube-api-access-79bz9\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.618938 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-catalog-content\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.665312 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.665353 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.667140 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.667311 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.721509 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-utilities\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.721854 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79bz9\" (UniqueName: \"kubernetes.io/projected/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-kube-api-access-79bz9\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.721878 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-catalog-content\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.725255 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-utilities\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.725570 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-catalog-content\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.745646 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79bz9\" (UniqueName: \"kubernetes.io/projected/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-kube-api-access-79bz9\") pod \"redhat-operators-zvsv5\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.809707 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.809756 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.825172 4747 patch_prober.go:28] interesting pod/console-f9d7485db-cjg4p container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.825266 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-cjg4p" podUID="f0a48ada-f9d6-49a1-a109-11b05e4b757c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.868940 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.874872 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvsv5"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.960951 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-brc8n"]
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.963680 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:45 crc kubenswrapper[4747]: I1128 13:20:45.978605 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brc8n"]
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.030178 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkvn7\" (UniqueName: \"kubernetes.io/projected/7759307e-b584-4d97-af0e-d3aebe6f9f08-kube-api-access-mkvn7\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.030541 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-catalog-content\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.030695 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-utilities\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.132224 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-catalog-content\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.132577 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-utilities\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.132660 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkvn7\" (UniqueName: \"kubernetes.io/projected/7759307e-b584-4d97-af0e-d3aebe6f9f08-kube-api-access-mkvn7\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.132890 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-catalog-content\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.133338 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-utilities\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.156072 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-c6vdt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.156117 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-c6vdt" podUID="5a895309-39e8-42bb-8df7-821c0c600504" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.156230 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-c6vdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.156258 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-c6vdt" podUID="5a895309-39e8-42bb-8df7-821c0c600504" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.156311 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.157731 4747 patch_prober.go:28] interesting pod/router-default-5444994796-z6ppk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 28 13:20:46 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Nov 28 13:20:46 crc kubenswrapper[4747]: [+]process-running ok
Nov 28 13:20:46 crc kubenswrapper[4747]: healthz check failed
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.157753 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z6ppk" podUID="1ff99da0-58a5-4e05-8e55-88f24bb0a962" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.166834 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkvn7\" (UniqueName: \"kubernetes.io/projected/7759307e-b584-4d97-af0e-d3aebe6f9f08-kube-api-access-mkvn7\") pod \"redhat-operators-brc8n\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") " pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.229920 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvsv5"]
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.241581 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.284921 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.285002 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.289238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvsv5" event={"ID":"4e4d880f-9a11-4d82-b099-1fbd6cae11ec","Type":"ContainerStarted","Data":"7d16f3e9d6c81123f8e593204b7139d6e896b4d7238a7984d1f35c36cbac3385"}
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.292841 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.293878 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" event={"ID":"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f","Type":"ContainerStarted","Data":"c45738f3d248468e0f1604edf2a2549487a5b58199886a686aba2c6dc0f32214"}
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.293917 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpqkc" event={"ID":"8dd5d3d3-f6f3-48da-8e99-2e16fd81582f","Type":"ContainerStarted","Data":"b0113cf1ae753942efb9db3a0049cee988630a6b82c44060572cb543396dc18a"}
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.303720 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d34970d2-14b1-445d-8fdc-41e36122c21e","Type":"ContainerStarted","Data":"fa8af7c6dcd419d99f20cee1e5b41323a6f48b8d34cc460a1b01bdd80b73f26e"}
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.303773 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d34970d2-14b1-445d-8fdc-41e36122c21e","Type":"ContainerStarted","Data":"87e7ebebfd579051937342b5c2d46e2cecca7bf54f4bf6b9f55cddbefcf5db13"}
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.306027 4747 generic.go:334] "Generic (PLEG): container finished" podID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerID="f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402" exitCode=0
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.306078 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9w47" event={"ID":"9319de37-d943-4f15-aec7-c2f0bf0ca64d","Type":"ContainerDied","Data":"f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402"}
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.319299 4747 generic.go:334] "Generic (PLEG): container finished" podID="00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c" containerID="dd7b61d6bc7474e224eccf3828f06f9ddb89c50139f313d9c50a30da94e73c80" exitCode=0
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.326073 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c","Type":"ContainerDied","Data":"dd7b61d6bc7474e224eccf3828f06f9ddb89c50139f313d9c50a30da94e73c80"}
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.326350 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jdt8x"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.330477 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.331448 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.3314262559999999 podStartE2EDuration="1.331426256s" podCreationTimestamp="2025-11-28 13:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:46.32789812 +0000 UTC m=+98.990379850" watchObservedRunningTime="2025-11-28 13:20:46.331426256 +0000 UTC m=+98.993907986"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.351182 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jpqkc" podStartSLOduration=80.35115513 podStartE2EDuration="1m20.35115513s" podCreationTimestamp="2025-11-28 13:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:46.343579925 +0000 UTC m=+99.006061655" watchObservedRunningTime="2025-11-28 13:20:46.35115513 +0000 UTC m=+99.013636870"
Nov 28 13:20:46 crc kubenswrapper[4747]: I1128 13:20:46.720465 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brc8n"]
Nov 28 13:20:46 crc kubenswrapper[4747]: W1128 13:20:46.779876 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7759307e_b584_4d97_af0e_d3aebe6f9f08.slice/crio-97d9b27528d807ee0d36c298c38623988d53eb2648356891584b34978c943d27 WatchSource:0}: Error finding container 97d9b27528d807ee0d36c298c38623988d53eb2648356891584b34978c943d27: Status 404 returned error can't find the container with id 97d9b27528d807ee0d36c298c38623988d53eb2648356891584b34978c943d27
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.156960 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.159638 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-z6ppk"
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.344463 4747 generic.go:334] "Generic (PLEG): container finished" podID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerID="de894167c4949aaf76f25ca2262846d4269831a558c3ec64ff9935e04a86ffbb" exitCode=0
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.344567 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvsv5" event={"ID":"4e4d880f-9a11-4d82-b099-1fbd6cae11ec","Type":"ContainerDied","Data":"de894167c4949aaf76f25ca2262846d4269831a558c3ec64ff9935e04a86ffbb"}
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.351517 4747 generic.go:334] "Generic (PLEG): container finished" podID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerID="769fab8f21db10d0540b9548153a1151f9b8c335a7cb39a1447640d6b57e002a" exitCode=0
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.351591 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brc8n" event={"ID":"7759307e-b584-4d97-af0e-d3aebe6f9f08","Type":"ContainerDied","Data":"769fab8f21db10d0540b9548153a1151f9b8c335a7cb39a1447640d6b57e002a"}
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.351617 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brc8n" event={"ID":"7759307e-b584-4d97-af0e-d3aebe6f9f08","Type":"ContainerStarted","Data":"97d9b27528d807ee0d36c298c38623988d53eb2648356891584b34978c943d27"}
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.355802 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d34970d2-14b1-445d-8fdc-41e36122c21e","Type":"ContainerDied","Data":"fa8af7c6dcd419d99f20cee1e5b41323a6f48b8d34cc460a1b01bdd80b73f26e"}
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.355493 4747 generic.go:334] "Generic (PLEG): container finished" podID="d34970d2-14b1-445d-8fdc-41e36122c21e" containerID="fa8af7c6dcd419d99f20cee1e5b41323a6f48b8d34cc460a1b01bdd80b73f26e" exitCode=0
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.367670 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w8vhf"
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.777321 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.868490 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kube-api-access\") pod \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") "
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.868551 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kubelet-dir\") pod \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\" (UID: \"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c\") "
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.869109 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c" (UID: "00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.876762 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c" (UID: "00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.973573 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 28 13:20:47 crc kubenswrapper[4747]: I1128 13:20:47.973601 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.374874 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.381996 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c","Type":"ContainerDied","Data":"08d1b72d0b7e013e5fb78c9b1236901027de1a361c8aac6ea4c9014e220a3a15"}
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.382818 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08d1b72d0b7e013e5fb78c9b1236901027de1a361c8aac6ea4c9014e220a3a15"
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.664881 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.789224 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d34970d2-14b1-445d-8fdc-41e36122c21e-kubelet-dir\") pod \"d34970d2-14b1-445d-8fdc-41e36122c21e\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") "
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.789284 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d34970d2-14b1-445d-8fdc-41e36122c21e-kube-api-access\") pod \"d34970d2-14b1-445d-8fdc-41e36122c21e\" (UID: \"d34970d2-14b1-445d-8fdc-41e36122c21e\") "
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.789386 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d34970d2-14b1-445d-8fdc-41e36122c21e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d34970d2-14b1-445d-8fdc-41e36122c21e" (UID: "d34970d2-14b1-445d-8fdc-41e36122c21e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.789671 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d34970d2-14b1-445d-8fdc-41e36122c21e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.800475 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34970d2-14b1-445d-8fdc-41e36122c21e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d34970d2-14b1-445d-8fdc-41e36122c21e" (UID: "d34970d2-14b1-445d-8fdc-41e36122c21e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:20:48 crc kubenswrapper[4747]: I1128 13:20:48.891174 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d34970d2-14b1-445d-8fdc-41e36122c21e-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 28 13:20:49 crc kubenswrapper[4747]: I1128 13:20:49.385169 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d34970d2-14b1-445d-8fdc-41e36122c21e","Type":"ContainerDied","Data":"87e7ebebfd579051937342b5c2d46e2cecca7bf54f4bf6b9f55cddbefcf5db13"}
Nov 28 13:20:49 crc kubenswrapper[4747]: I1128 13:20:49.385267 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87e7ebebfd579051937342b5c2d46e2cecca7bf54f4bf6b9f55cddbefcf5db13"
Nov 28 13:20:49 crc kubenswrapper[4747]: I1128 13:20:49.385266 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 28 13:20:51 crc kubenswrapper[4747]: I1128 13:20:51.340116 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cgmxb"
Nov 28 13:20:55 crc kubenswrapper[4747]: I1128 13:20:55.823855 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:55 crc kubenswrapper[4747]: I1128 13:20:55.827396 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-cjg4p"
Nov 28 13:20:56 crc kubenswrapper[4747]: I1128 13:20:56.159588 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-c6vdt"
Nov 28 13:21:02 crc kubenswrapper[4747]: I1128 13:21:02.581359 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b"
Nov 28 13:21:15 crc kubenswrapper[4747]: I1128 13:21:15.836687 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bf44"
Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.139263 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.139673 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dxrpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-q9w47_openshift-marketplace(9319de37-d943-4f15-aec7-c2f0bf0ca64d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.140854 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-q9w47" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d"
Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.178190 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.180032 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5t8gb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-86pgk_openshift-marketplace(0a8a42c3-62aa-4542-b86e-171e124c81f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.181165 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-86pgk" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4"
Nov 28 13:21:16 crc
kubenswrapper[4747]: I1128 13:21:16.608557 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz5h9" event={"ID":"f857c545-4773-4fc8-87b3-37367dd71e20","Type":"ContainerStarted","Data":"923986b42d2cdd47a78ad54da4072a239bcd93814641f752ae7722c2b9c4d3d4"} Nov 28 13:21:16 crc kubenswrapper[4747]: I1128 13:21:16.609748 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7khwx" event={"ID":"cca84f8d-3b79-44e5-8de8-af6bc47e7bba","Type":"ContainerStarted","Data":"5533dfd7e753a2af46fbc8912145eb0e0e40512d10146bb7ae1f5563e47142d1"} Nov 28 13:21:16 crc kubenswrapper[4747]: I1128 13:21:16.612765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4g45d" event={"ID":"e62c66db-e2f6-49e2-a663-d8f31f895c1c","Type":"ContainerStarted","Data":"41bf2be995ae1573a23676347dc6b6cc7bdd7786921601a6784d7aa45309e2b0"} Nov 28 13:21:16 crc kubenswrapper[4747]: I1128 13:21:16.614155 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvsv5" event={"ID":"4e4d880f-9a11-4d82-b099-1fbd6cae11ec","Type":"ContainerStarted","Data":"6cd42d7afc5412e9d018e3743be7dcc069f09e3f1b529525da639f804e8b019e"} Nov 28 13:21:16 crc kubenswrapper[4747]: I1128 13:21:16.615593 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brc8n" event={"ID":"7759307e-b584-4d97-af0e-d3aebe6f9f08","Type":"ContainerStarted","Data":"7beb0ba582e6bb3b574680be7f5c3b07f979280bdf997d2eec8c1474ed7d1fdb"} Nov 28 13:21:16 crc kubenswrapper[4747]: I1128 13:21:16.616888 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d995" event={"ID":"e05164c9-17fa-41ef-9abe-b00460c2cb96","Type":"ContainerStarted","Data":"8d165f362e4e6b6740474da36e38658079514efae39e4f65f31b5b54af363d33"} Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.618215 4747 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-86pgk" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" Nov 28 13:21:16 crc kubenswrapper[4747]: E1128 13:21:16.618880 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-q9w47" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.628976 4747 generic.go:334] "Generic (PLEG): container finished" podID="f857c545-4773-4fc8-87b3-37367dd71e20" containerID="923986b42d2cdd47a78ad54da4072a239bcd93814641f752ae7722c2b9c4d3d4" exitCode=0 Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.629099 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz5h9" event={"ID":"f857c545-4773-4fc8-87b3-37367dd71e20","Type":"ContainerDied","Data":"923986b42d2cdd47a78ad54da4072a239bcd93814641f752ae7722c2b9c4d3d4"} Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.633663 4747 generic.go:334] "Generic (PLEG): container finished" podID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerID="41bf2be995ae1573a23676347dc6b6cc7bdd7786921601a6784d7aa45309e2b0" exitCode=0 Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.633792 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4g45d" event={"ID":"e62c66db-e2f6-49e2-a663-d8f31f895c1c","Type":"ContainerDied","Data":"41bf2be995ae1573a23676347dc6b6cc7bdd7786921601a6784d7aa45309e2b0"} Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.638246 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerID="5533dfd7e753a2af46fbc8912145eb0e0e40512d10146bb7ae1f5563e47142d1" exitCode=0 Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.638385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7khwx" event={"ID":"cca84f8d-3b79-44e5-8de8-af6bc47e7bba","Type":"ContainerDied","Data":"5533dfd7e753a2af46fbc8912145eb0e0e40512d10146bb7ae1f5563e47142d1"} Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.641789 4747 generic.go:334] "Generic (PLEG): container finished" podID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerID="6cd42d7afc5412e9d018e3743be7dcc069f09e3f1b529525da639f804e8b019e" exitCode=0 Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.645548 4747 generic.go:334] "Generic (PLEG): container finished" podID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerID="7beb0ba582e6bb3b574680be7f5c3b07f979280bdf997d2eec8c1474ed7d1fdb" exitCode=0 Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.649569 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvsv5" event={"ID":"4e4d880f-9a11-4d82-b099-1fbd6cae11ec","Type":"ContainerDied","Data":"6cd42d7afc5412e9d018e3743be7dcc069f09e3f1b529525da639f804e8b019e"} Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.649604 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brc8n" event={"ID":"7759307e-b584-4d97-af0e-d3aebe6f9f08","Type":"ContainerDied","Data":"7beb0ba582e6bb3b574680be7f5c3b07f979280bdf997d2eec8c1474ed7d1fdb"} Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.652364 4747 generic.go:334] "Generic (PLEG): container finished" podID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerID="8d165f362e4e6b6740474da36e38658079514efae39e4f65f31b5b54af363d33" exitCode=0 Nov 28 13:21:17 crc kubenswrapper[4747]: I1128 13:21:17.652443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6d995" event={"ID":"e05164c9-17fa-41ef-9abe-b00460c2cb96","Type":"ContainerDied","Data":"8d165f362e4e6b6740474da36e38658079514efae39e4f65f31b5b54af363d33"} Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.661480 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d995" event={"ID":"e05164c9-17fa-41ef-9abe-b00460c2cb96","Type":"ContainerStarted","Data":"cd9084b0d713eda79fa288e5c48b0d0ef50840c78894fad81451d4686129be55"} Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.666277 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz5h9" event={"ID":"f857c545-4773-4fc8-87b3-37367dd71e20","Type":"ContainerStarted","Data":"3abca98eb9821fca078ad753aab90906f9faf0a48ef7462aeac478011d80db37"} Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.668837 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4g45d" event={"ID":"e62c66db-e2f6-49e2-a663-d8f31f895c1c","Type":"ContainerStarted","Data":"771d9959d3be8e3690dd793a66795039910cd97cce616c8b69c6e047ef648404"} Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.672305 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7khwx" event={"ID":"cca84f8d-3b79-44e5-8de8-af6bc47e7bba","Type":"ContainerStarted","Data":"907f61ef0c548664a6655890d1f078da6fa91d84c74e7cbdbbb7165d1d45bfd4"} Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.675123 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvsv5" event={"ID":"4e4d880f-9a11-4d82-b099-1fbd6cae11ec","Type":"ContainerStarted","Data":"8ac6e215e0585923eab640f34afcd9c8f2017e167b3a48ec06f6542870e85ed9"} Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.677512 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brc8n" 
event={"ID":"7759307e-b584-4d97-af0e-d3aebe6f9f08","Type":"ContainerStarted","Data":"b1012297d216e8c71d84d7fe1f462a5609b2ae80034ebb466d4157e39b98a5b5"} Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.685999 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6d995" podStartSLOduration=2.7729038409999998 podStartE2EDuration="36.685982208s" podCreationTimestamp="2025-11-28 13:20:42 +0000 UTC" firstStartedPulling="2025-11-28 13:20:44.219493096 +0000 UTC m=+96.881974826" lastFinishedPulling="2025-11-28 13:21:18.132571453 +0000 UTC m=+130.795053193" observedRunningTime="2025-11-28 13:21:18.681226389 +0000 UTC m=+131.343708119" watchObservedRunningTime="2025-11-28 13:21:18.685982208 +0000 UTC m=+131.348463938" Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.702899 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4g45d" podStartSLOduration=2.739769232 podStartE2EDuration="36.702875396s" podCreationTimestamp="2025-11-28 13:20:42 +0000 UTC" firstStartedPulling="2025-11-28 13:20:44.231698117 +0000 UTC m=+96.894179847" lastFinishedPulling="2025-11-28 13:21:18.194804281 +0000 UTC m=+130.857286011" observedRunningTime="2025-11-28 13:21:18.70193781 +0000 UTC m=+131.364419540" watchObservedRunningTime="2025-11-28 13:21:18.702875396 +0000 UTC m=+131.365357126" Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.719589 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jz5h9" podStartSLOduration=2.6058344399999998 podStartE2EDuration="36.719568328s" podCreationTimestamp="2025-11-28 13:20:42 +0000 UTC" firstStartedPulling="2025-11-28 13:20:44.228445638 +0000 UTC m=+96.890927368" lastFinishedPulling="2025-11-28 13:21:18.342179516 +0000 UTC m=+131.004661256" observedRunningTime="2025-11-28 13:21:18.717259945 +0000 UTC m=+131.379741675" 
watchObservedRunningTime="2025-11-28 13:21:18.719568328 +0000 UTC m=+131.382050058" Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.732253 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvsv5" podStartSLOduration=2.9238413789999997 podStartE2EDuration="33.732221401s" podCreationTimestamp="2025-11-28 13:20:45 +0000 UTC" firstStartedPulling="2025-11-28 13:20:47.349465127 +0000 UTC m=+100.011946857" lastFinishedPulling="2025-11-28 13:21:18.157845149 +0000 UTC m=+130.820326879" observedRunningTime="2025-11-28 13:21:18.73216665 +0000 UTC m=+131.394648380" watchObservedRunningTime="2025-11-28 13:21:18.732221401 +0000 UTC m=+131.394703131" Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.773723 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7khwx" podStartSLOduration=2.816553014 podStartE2EDuration="36.773691596s" podCreationTimestamp="2025-11-28 13:20:42 +0000 UTC" firstStartedPulling="2025-11-28 13:20:44.236478276 +0000 UTC m=+96.898960006" lastFinishedPulling="2025-11-28 13:21:18.193616858 +0000 UTC m=+130.856098588" observedRunningTime="2025-11-28 13:21:18.752060169 +0000 UTC m=+131.414541909" watchObservedRunningTime="2025-11-28 13:21:18.773691596 +0000 UTC m=+131.436173326" Nov 28 13:21:18 crc kubenswrapper[4747]: I1128 13:21:18.776578 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-brc8n" podStartSLOduration=2.7582338010000003 podStartE2EDuration="33.776570294s" podCreationTimestamp="2025-11-28 13:20:45 +0000 UTC" firstStartedPulling="2025-11-28 13:20:47.353458426 +0000 UTC m=+100.015940156" lastFinishedPulling="2025-11-28 13:21:18.371794919 +0000 UTC m=+131.034276649" observedRunningTime="2025-11-28 13:21:18.771750323 +0000 UTC m=+131.434232053" watchObservedRunningTime="2025-11-28 13:21:18.776570294 +0000 UTC m=+131.439052024" Nov 28 
13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.552284 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6d995" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.552582 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6d995" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.618560 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6d995" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.695886 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.695919 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.738671 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.908761 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.908814 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:21:22 crc kubenswrapper[4747]: I1128 13:21:22.966170 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:21:23 crc kubenswrapper[4747]: I1128 13:21:23.146037 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:21:23 crc kubenswrapper[4747]: I1128 13:21:23.146093 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:21:23 crc kubenswrapper[4747]: I1128 13:21:23.186526 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:21:23 crc kubenswrapper[4747]: I1128 13:21:23.742285 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:21:23 crc kubenswrapper[4747]: I1128 13:21:23.750460 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:21:23 crc kubenswrapper[4747]: I1128 13:21:23.756966 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4g45d" Nov 28 13:21:24 crc kubenswrapper[4747]: I1128 13:21:24.969552 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4g45d"] Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.557323 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 28 13:21:25 crc kubenswrapper[4747]: E1128 13:21:25.557597 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c" containerName="pruner" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.557613 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c" containerName="pruner" Nov 28 13:21:25 crc kubenswrapper[4747]: E1128 13:21:25.557655 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34970d2-14b1-445d-8fdc-41e36122c21e" containerName="pruner" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.557664 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34970d2-14b1-445d-8fdc-41e36122c21e" containerName="pruner" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 
13:21:25.557802 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c8bfec-bbc7-4954-89fc-c0d81c9c8a0c" containerName="pruner" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.557820 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34970d2-14b1-445d-8fdc-41e36122c21e" containerName="pruner" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.559468 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.562676 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.562690 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.572332 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jz5h9"] Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.577700 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.716310 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jz5h9" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="registry-server" containerID="cri-o://3abca98eb9821fca078ad753aab90906f9faf0a48ef7462aeac478011d80db37" gracePeriod=2 Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.716589 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4g45d" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="registry-server" containerID="cri-o://771d9959d3be8e3690dd793a66795039910cd97cce616c8b69c6e047ef648404" gracePeriod=2 Nov 28 
13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.742224 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3209c35c-09ff-4df0-911d-ec6ca70edd81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.742513 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3209c35c-09ff-4df0-911d-ec6ca70edd81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.844039 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3209c35c-09ff-4df0-911d-ec6ca70edd81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.844142 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3209c35c-09ff-4df0-911d-ec6ca70edd81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.844165 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3209c35c-09ff-4df0-911d-ec6ca70edd81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: 
I1128 13:21:25.870368 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3209c35c-09ff-4df0-911d-ec6ca70edd81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.876155 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvsv5" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.876197 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvsv5" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.892274 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:25 crc kubenswrapper[4747]: I1128 13:21:25.944723 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvsv5" Nov 28 13:21:26 crc kubenswrapper[4747]: I1128 13:21:26.138220 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 28 13:21:26 crc kubenswrapper[4747]: I1128 13:21:26.331649 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-brc8n" Nov 28 13:21:26 crc kubenswrapper[4747]: I1128 13:21:26.332016 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-brc8n" Nov 28 13:21:26 crc kubenswrapper[4747]: I1128 13:21:26.373797 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-brc8n" Nov 28 13:21:26 crc kubenswrapper[4747]: I1128 13:21:26.723401 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"3209c35c-09ff-4df0-911d-ec6ca70edd81","Type":"ContainerStarted","Data":"a20e48fcc6583e9b37d8cf254c7258250572f35dae902218079140e2a95ee438"} Nov 28 13:21:26 crc kubenswrapper[4747]: I1128 13:21:26.760801 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvsv5" Nov 28 13:21:26 crc kubenswrapper[4747]: I1128 13:21:26.763774 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-brc8n" Nov 28 13:21:27 crc kubenswrapper[4747]: I1128 13:21:27.730871 4747 generic.go:334] "Generic (PLEG): container finished" podID="f857c545-4773-4fc8-87b3-37367dd71e20" containerID="3abca98eb9821fca078ad753aab90906f9faf0a48ef7462aeac478011d80db37" exitCode=0 Nov 28 13:21:27 crc kubenswrapper[4747]: I1128 13:21:27.730924 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz5h9" event={"ID":"f857c545-4773-4fc8-87b3-37367dd71e20","Type":"ContainerDied","Data":"3abca98eb9821fca078ad753aab90906f9faf0a48ef7462aeac478011d80db37"} Nov 28 13:21:27 crc kubenswrapper[4747]: I1128 13:21:27.734063 4747 generic.go:334] "Generic (PLEG): container finished" podID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerID="771d9959d3be8e3690dd793a66795039910cd97cce616c8b69c6e047ef648404" exitCode=0 Nov 28 13:21:27 crc kubenswrapper[4747]: I1128 13:21:27.734135 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4g45d" event={"ID":"e62c66db-e2f6-49e2-a663-d8f31f895c1c","Type":"ContainerDied","Data":"771d9959d3be8e3690dd793a66795039910cd97cce616c8b69c6e047ef648404"} Nov 28 13:21:27 crc kubenswrapper[4747]: I1128 13:21:27.738269 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"3209c35c-09ff-4df0-911d-ec6ca70edd81","Type":"ContainerStarted","Data":"2c7d422fa680a214d1d1a1ed3b4fff049b5ce3a7f1610bf0bb7793665f402750"} Nov 28 13:21:27 crc kubenswrapper[4747]: I1128 13:21:27.749568 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.74954954 podStartE2EDuration="2.74954954s" podCreationTimestamp="2025-11-28 13:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:21:27.748646088 +0000 UTC m=+140.411127848" watchObservedRunningTime="2025-11-28 13:21:27.74954954 +0000 UTC m=+140.412031270" Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.661632 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jz5h9" Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.751557 4747 generic.go:334] "Generic (PLEG): container finished" podID="3209c35c-09ff-4df0-911d-ec6ca70edd81" containerID="2c7d422fa680a214d1d1a1ed3b4fff049b5ce3a7f1610bf0bb7793665f402750" exitCode=0 Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.751618 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3209c35c-09ff-4df0-911d-ec6ca70edd81","Type":"ContainerDied","Data":"2c7d422fa680a214d1d1a1ed3b4fff049b5ce3a7f1610bf0bb7793665f402750"} Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.753522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz5h9" event={"ID":"f857c545-4773-4fc8-87b3-37367dd71e20","Type":"ContainerDied","Data":"fc3342d47658b56131e64c646c7c20b0e2102b04fadcc21187691001d1ae0ce1"} Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.753554 4747 scope.go:117] "RemoveContainer" containerID="3abca98eb9821fca078ad753aab90906f9faf0a48ef7462aeac478011d80db37" 
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.753569 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jz5h9"
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.766073 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-brc8n"]
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.770139 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-brc8n" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="registry-server" containerID="cri-o://b1012297d216e8c71d84d7fe1f462a5609b2ae80034ebb466d4157e39b98a5b5" gracePeriod=2
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.782639 4747 scope.go:117] "RemoveContainer" containerID="923986b42d2cdd47a78ad54da4072a239bcd93814641f752ae7722c2b9c4d3d4"
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.800043 4747 scope.go:117] "RemoveContainer" containerID="529d181420112a5a2a446c73d5115f376f0ee04fcdf8aa53dde0e76fa27b9026"
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.808084 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kblrx\" (UniqueName: \"kubernetes.io/projected/f857c545-4773-4fc8-87b3-37367dd71e20-kube-api-access-kblrx\") pod \"f857c545-4773-4fc8-87b3-37367dd71e20\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") "
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.808147 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-utilities\") pod \"f857c545-4773-4fc8-87b3-37367dd71e20\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") "
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.808191 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-catalog-content\") pod \"f857c545-4773-4fc8-87b3-37367dd71e20\" (UID: \"f857c545-4773-4fc8-87b3-37367dd71e20\") "
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.809457 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-utilities" (OuterVolumeSpecName: "utilities") pod "f857c545-4773-4fc8-87b3-37367dd71e20" (UID: "f857c545-4773-4fc8-87b3-37367dd71e20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.815109 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f857c545-4773-4fc8-87b3-37367dd71e20-kube-api-access-kblrx" (OuterVolumeSpecName: "kube-api-access-kblrx") pod "f857c545-4773-4fc8-87b3-37367dd71e20" (UID: "f857c545-4773-4fc8-87b3-37367dd71e20"). InnerVolumeSpecName "kube-api-access-kblrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.878243 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f857c545-4773-4fc8-87b3-37367dd71e20" (UID: "f857c545-4773-4fc8-87b3-37367dd71e20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.910211 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.910249 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kblrx\" (UniqueName: \"kubernetes.io/projected/f857c545-4773-4fc8-87b3-37367dd71e20-kube-api-access-kblrx\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.910259 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f857c545-4773-4fc8-87b3-37367dd71e20-utilities\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:29 crc kubenswrapper[4747]: I1128 13:21:29.953854 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4g45d"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.082515 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jz5h9"]
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.085224 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jz5h9"]
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.112864 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-utilities\") pod \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") "
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.112944 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-catalog-content\") pod \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") "
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.112995 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxhtd\" (UniqueName: \"kubernetes.io/projected/e62c66db-e2f6-49e2-a663-d8f31f895c1c-kube-api-access-rxhtd\") pod \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\" (UID: \"e62c66db-e2f6-49e2-a663-d8f31f895c1c\") "
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.113880 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-utilities" (OuterVolumeSpecName: "utilities") pod "e62c66db-e2f6-49e2-a663-d8f31f895c1c" (UID: "e62c66db-e2f6-49e2-a663-d8f31f895c1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.116249 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e62c66db-e2f6-49e2-a663-d8f31f895c1c-kube-api-access-rxhtd" (OuterVolumeSpecName: "kube-api-access-rxhtd") pod "e62c66db-e2f6-49e2-a663-d8f31f895c1c" (UID: "e62c66db-e2f6-49e2-a663-d8f31f895c1c"). InnerVolumeSpecName "kube-api-access-rxhtd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.214795 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-utilities\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.214835 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxhtd\" (UniqueName: \"kubernetes.io/projected/e62c66db-e2f6-49e2-a663-d8f31f895c1c-kube-api-access-rxhtd\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.763771 4747 generic.go:334] "Generic (PLEG): container finished" podID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerID="b1012297d216e8c71d84d7fe1f462a5609b2ae80034ebb466d4157e39b98a5b5" exitCode=0
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.763850 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brc8n" event={"ID":"7759307e-b584-4d97-af0e-d3aebe6f9f08","Type":"ContainerDied","Data":"b1012297d216e8c71d84d7fe1f462a5609b2ae80034ebb466d4157e39b98a5b5"}
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.767757 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4g45d"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.767744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4g45d" event={"ID":"e62c66db-e2f6-49e2-a663-d8f31f895c1c","Type":"ContainerDied","Data":"b390773bb6d7add095367c7c81c9abb37fed7ce856d962b1238143b1a16f1473"}
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.767836 4747 scope.go:117] "RemoveContainer" containerID="771d9959d3be8e3690dd793a66795039910cd97cce616c8b69c6e047ef648404"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.864089 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e62c66db-e2f6-49e2-a663-d8f31f895c1c" (UID: "e62c66db-e2f6-49e2-a663-d8f31f895c1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.898670 4747 scope.go:117] "RemoveContainer" containerID="41bf2be995ae1573a23676347dc6b6cc7bdd7786921601a6784d7aa45309e2b0"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.916391 4747 scope.go:117] "RemoveContainer" containerID="70e548b1b73260c2bd83c93a869b83248270cf109bfe490da1258775c7555646"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.924638 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62c66db-e2f6-49e2-a663-d8f31f895c1c-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.953799 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 28 13:21:30 crc kubenswrapper[4747]: E1128 13:21:30.954008 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="registry-server"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954020 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="registry-server"
Nov 28 13:21:30 crc kubenswrapper[4747]: E1128 13:21:30.954029 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="extract-content"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954035 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="extract-content"
Nov 28 13:21:30 crc kubenswrapper[4747]: E1128 13:21:30.954046 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="extract-content"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954053 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="extract-content"
Nov 28 13:21:30 crc kubenswrapper[4747]: E1128 13:21:30.954064 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="extract-utilities"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954070 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="extract-utilities"
Nov 28 13:21:30 crc kubenswrapper[4747]: E1128 13:21:30.954080 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="extract-utilities"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954085 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="extract-utilities"
Nov 28 13:21:30 crc kubenswrapper[4747]: E1128 13:21:30.954095 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="registry-server"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954100 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="registry-server"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954197 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" containerName="registry-server"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.954386 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" containerName="registry-server"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.955848 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.965033 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 28 13:21:30 crc kubenswrapper[4747]: I1128 13:21:30.969445 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.084382 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.096270 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4g45d"]
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.099002 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4g45d"]
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.128696 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3209c35c-09ff-4df0-911d-ec6ca70edd81-kubelet-dir\") pod \"3209c35c-09ff-4df0-911d-ec6ca70edd81\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") "
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.128755 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3209c35c-09ff-4df0-911d-ec6ca70edd81-kube-api-access\") pod \"3209c35c-09ff-4df0-911d-ec6ca70edd81\" (UID: \"3209c35c-09ff-4df0-911d-ec6ca70edd81\") "
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.128881 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3209c35c-09ff-4df0-911d-ec6ca70edd81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3209c35c-09ff-4df0-911d-ec6ca70edd81" (UID: "3209c35c-09ff-4df0-911d-ec6ca70edd81"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.128901 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69090d14-4f69-445b-9da6-7cbf9d412a2b-kube-api-access\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.129062 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.129141 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-var-lock\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.129496 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3209c35c-09ff-4df0-911d-ec6ca70edd81-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.133687 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3209c35c-09ff-4df0-911d-ec6ca70edd81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3209c35c-09ff-4df0-911d-ec6ca70edd81" (UID: "3209c35c-09ff-4df0-911d-ec6ca70edd81"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.230477 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-catalog-content\") pod \"7759307e-b584-4d97-af0e-d3aebe6f9f08\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") "
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.230557 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkvn7\" (UniqueName: \"kubernetes.io/projected/7759307e-b584-4d97-af0e-d3aebe6f9f08-kube-api-access-mkvn7\") pod \"7759307e-b584-4d97-af0e-d3aebe6f9f08\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") "
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.230597 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-utilities\") pod \"7759307e-b584-4d97-af0e-d3aebe6f9f08\" (UID: \"7759307e-b584-4d97-af0e-d3aebe6f9f08\") "
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.230886 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69090d14-4f69-445b-9da6-7cbf9d412a2b-kube-api-access\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.230938 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.230968 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-var-lock\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.231032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.231083 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3209c35c-09ff-4df0-911d-ec6ca70edd81-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.231123 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-var-lock\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.231604 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-utilities" (OuterVolumeSpecName: "utilities") pod "7759307e-b584-4d97-af0e-d3aebe6f9f08" (UID: "7759307e-b584-4d97-af0e-d3aebe6f9f08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.234802 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7759307e-b584-4d97-af0e-d3aebe6f9f08-kube-api-access-mkvn7" (OuterVolumeSpecName: "kube-api-access-mkvn7") pod "7759307e-b584-4d97-af0e-d3aebe6f9f08" (UID: "7759307e-b584-4d97-af0e-d3aebe6f9f08"). InnerVolumeSpecName "kube-api-access-mkvn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.247582 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69090d14-4f69-445b-9da6-7cbf9d412a2b-kube-api-access\") pod \"installer-9-crc\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.284668 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.331885 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkvn7\" (UniqueName: \"kubernetes.io/projected/7759307e-b584-4d97-af0e-d3aebe6f9f08-kube-api-access-mkvn7\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.331914 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-utilities\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.471193 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.623484 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7759307e-b584-4d97-af0e-d3aebe6f9f08" (UID: "7759307e-b584-4d97-af0e-d3aebe6f9f08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.635343 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759307e-b584-4d97-af0e-d3aebe6f9f08-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.651670 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e62c66db-e2f6-49e2-a663-d8f31f895c1c" path="/var/lib/kubelet/pods/e62c66db-e2f6-49e2-a663-d8f31f895c1c/volumes"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.652599 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f857c545-4773-4fc8-87b3-37367dd71e20" path="/var/lib/kubelet/pods/f857c545-4773-4fc8-87b3-37367dd71e20/volumes"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.780883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brc8n" event={"ID":"7759307e-b584-4d97-af0e-d3aebe6f9f08","Type":"ContainerDied","Data":"97d9b27528d807ee0d36c298c38623988d53eb2648356891584b34978c943d27"}
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.780950 4747 scope.go:117] "RemoveContainer" containerID="b1012297d216e8c71d84d7fe1f462a5609b2ae80034ebb466d4157e39b98a5b5"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.780951 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brc8n"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.782966 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3209c35c-09ff-4df0-911d-ec6ca70edd81","Type":"ContainerDied","Data":"a20e48fcc6583e9b37d8cf254c7258250572f35dae902218079140e2a95ee438"}
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.782991 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.783002 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20e48fcc6583e9b37d8cf254c7258250572f35dae902218079140e2a95ee438"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.784480 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"69090d14-4f69-445b-9da6-7cbf9d412a2b","Type":"ContainerStarted","Data":"d2085b88844284f43087faccc12f3cbc83981eb8cb322210c32a467557b83001"}
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.811527 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-brc8n"]
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.815901 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-brc8n"]
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.823413 4747 scope.go:117] "RemoveContainer" containerID="7beb0ba582e6bb3b574680be7f5c3b07f979280bdf997d2eec8c1474ed7d1fdb"
Nov 28 13:21:31 crc kubenswrapper[4747]: I1128 13:21:31.842280 4747 scope.go:117] "RemoveContainer" containerID="769fab8f21db10d0540b9548153a1151f9b8c335a7cb39a1447640d6b57e002a"
Nov 28 13:21:32 crc kubenswrapper[4747]: I1128 13:21:32.598934 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6d995"
Nov 28 13:21:32 crc kubenswrapper[4747]: I1128 13:21:32.794399 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"69090d14-4f69-445b-9da6-7cbf9d412a2b","Type":"ContainerStarted","Data":"a0db3e0886bd5968eb56b017475af57bc2516d564ad11d67ee48f72dbc9877a9"}
Nov 28 13:21:32 crc kubenswrapper[4747]: I1128 13:21:32.800073 4747 generic.go:334] "Generic (PLEG): container finished" podID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerID="69e8f940eda06060a16da7d49a02d66eda33de7b5b34971e2afcd6782c5c8dc0" exitCode=0
Nov 28 13:21:32 crc kubenswrapper[4747]: I1128 13:21:32.800108 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86pgk" event={"ID":"0a8a42c3-62aa-4542-b86e-171e124c81f4","Type":"ContainerDied","Data":"69e8f940eda06060a16da7d49a02d66eda33de7b5b34971e2afcd6782c5c8dc0"}
Nov 28 13:21:32 crc kubenswrapper[4747]: I1128 13:21:32.803152 4747 generic.go:334] "Generic (PLEG): container finished" podID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerID="e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d" exitCode=0
Nov 28 13:21:32 crc kubenswrapper[4747]: I1128 13:21:32.803188 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9w47" event={"ID":"9319de37-d943-4f15-aec7-c2f0bf0ca64d","Type":"ContainerDied","Data":"e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d"}
Nov 28 13:21:32 crc kubenswrapper[4747]: I1128 13:21:32.812853 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.812834361 podStartE2EDuration="2.812834361s" podCreationTimestamp="2025-11-28 13:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:21:32.809641853 +0000 UTC m=+145.472123593" watchObservedRunningTime="2025-11-28 13:21:32.812834361 +0000 UTC m=+145.475316101"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.652185 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" path="/var/lib/kubelet/pods/7759307e-b584-4d97-af0e-d3aebe6f9f08/volumes"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.665919 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.665969 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.666014 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.666071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.671058 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.671881 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.671995 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.677567 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.677621 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.685299 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.692110 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.693423 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.702050 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.706023 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.813796 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9w47" event={"ID":"9319de37-d943-4f15-aec7-c2f0bf0ca64d","Type":"ContainerStarted","Data":"ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50"}
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.817780 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86pgk" event={"ID":"0a8a42c3-62aa-4542-b86e-171e124c81f4","Type":"ContainerStarted","Data":"5af70dbfe60b9acdeda3ce485d7c6f3319a3abe090d371d6b726ade2fdc121a3"}
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.912792 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q9w47" podStartSLOduration=2.978334932 podStartE2EDuration="49.912774219s" podCreationTimestamp="2025-11-28 13:20:44 +0000 UTC" firstStartedPulling="2025-11-28 13:20:46.310482388 +0000 UTC m=+98.972964118" lastFinishedPulling="2025-11-28 13:21:33.244921665 +0000 UTC m=+145.907403405" observedRunningTime="2025-11-28 13:21:33.864251514 +0000 UTC m=+146.526733244" watchObservedRunningTime="2025-11-28 13:21:33.912774219 +0000 UTC m=+146.575255949"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.913075 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-86pgk" podStartSLOduration=1.678640273 podStartE2EDuration="49.913071776s" podCreationTimestamp="2025-11-28 13:20:44 +0000 UTC" firstStartedPulling="2025-11-28 13:20:45.252835912 +0000 UTC m=+97.915317642" lastFinishedPulling="2025-11-28 13:21:33.487267415 +0000 UTC m=+146.149749145" observedRunningTime="2025-11-28 13:21:33.911349294 +0000 UTC m=+146.573831024" watchObservedRunningTime="2025-11-28 13:21:33.913071776 +0000 UTC m=+146.575553506"
Nov 28 13:21:33 crc kubenswrapper[4747]: I1128 13:21:33.992615 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.479117 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-86pgk"
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.479463 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-86pgk"
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.831699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9046f21fa2b6ec4717804c4042f31f74725e57843295243195d1c19ffc105b35"}
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.832050 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"00712aadf159660d4af40a238d56113975a20478090b4ecd320d344581f743ad"}
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.833506 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7f607b840f8feda5d34e50000be83fe603c9ef2af54bde6c9e83161661bf0796"}
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.833568 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b3fa3c8e6eed053ae11809d99b62f863b469bee03adcda22894e612c24a31299"}
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.834261 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.835681 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"23f2c381f7fb2ce3885150684ad123421c1684ea84534a2d1be37bb96b13fa82"}
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.835717 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"77d14aa1407c1e29d6679817bdf9a742cef06d653a7b4ae08d3949d4c2c0584e"}
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.906743 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q9w47"
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.907514 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q9w47"
Nov 28 13:21:34 crc kubenswrapper[4747]: I1128 13:21:34.956523 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q9w47"
Nov 28 13:21:35 crc kubenswrapper[4747]: I1128 13:21:35.527112 
4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-86pgk" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="registry-server" probeResult="failure" output=< Nov 28 13:21:35 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Nov 28 13:21:35 crc kubenswrapper[4747]: > Nov 28 13:21:44 crc kubenswrapper[4747]: I1128 13:21:44.549968 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:21:44 crc kubenswrapper[4747]: I1128 13:21:44.620639 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:21:44 crc kubenswrapper[4747]: I1128 13:21:44.980793 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:21:46 crc kubenswrapper[4747]: I1128 13:21:46.569737 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9w47"] Nov 28 13:21:46 crc kubenswrapper[4747]: I1128 13:21:46.570338 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q9w47" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="registry-server" containerID="cri-o://ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50" gracePeriod=2 Nov 28 13:21:47 crc kubenswrapper[4747]: I1128 13:21:47.632753 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:21:47 crc kubenswrapper[4747]: I1128 13:21:47.633300 4747 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:21:50 crc kubenswrapper[4747]: I1128 13:21:50.956756 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:21:50 crc kubenswrapper[4747]: I1128 13:21:50.958966 4747 generic.go:334] "Generic (PLEG): container finished" podID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerID="ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50" exitCode=0 Nov 28 13:21:50 crc kubenswrapper[4747]: I1128 13:21:50.959008 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9w47" event={"ID":"9319de37-d943-4f15-aec7-c2f0bf0ca64d","Type":"ContainerDied","Data":"ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50"} Nov 28 13:21:50 crc kubenswrapper[4747]: I1128 13:21:50.959023 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9w47" Nov 28 13:21:50 crc kubenswrapper[4747]: I1128 13:21:50.959043 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9w47" event={"ID":"9319de37-d943-4f15-aec7-c2f0bf0ca64d","Type":"ContainerDied","Data":"f165400fa37a365910360d8af1bf1e9f3aa26f48c0a5e31bbbc351871dad878c"} Nov 28 13:21:50 crc kubenswrapper[4747]: I1128 13:21:50.959068 4747 scope.go:117] "RemoveContainer" containerID="ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50" Nov 28 13:21:50 crc kubenswrapper[4747]: I1128 13:21:50.995999 4747 scope.go:117] "RemoveContainer" containerID="e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.023145 4747 scope.go:117] "RemoveContainer" containerID="f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.033713 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-utilities\") pod \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.033825 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-catalog-content\") pod \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\" (UID: \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.033855 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxrpl\" (UniqueName: \"kubernetes.io/projected/9319de37-d943-4f15-aec7-c2f0bf0ca64d-kube-api-access-dxrpl\") pod \"9319de37-d943-4f15-aec7-c2f0bf0ca64d\" (UID: 
\"9319de37-d943-4f15-aec7-c2f0bf0ca64d\") " Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.035760 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-utilities" (OuterVolumeSpecName: "utilities") pod "9319de37-d943-4f15-aec7-c2f0bf0ca64d" (UID: "9319de37-d943-4f15-aec7-c2f0bf0ca64d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.040353 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9319de37-d943-4f15-aec7-c2f0bf0ca64d-kube-api-access-dxrpl" (OuterVolumeSpecName: "kube-api-access-dxrpl") pod "9319de37-d943-4f15-aec7-c2f0bf0ca64d" (UID: "9319de37-d943-4f15-aec7-c2f0bf0ca64d"). InnerVolumeSpecName "kube-api-access-dxrpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.074364 4747 scope.go:117] "RemoveContainer" containerID="ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.075450 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9319de37-d943-4f15-aec7-c2f0bf0ca64d" (UID: "9319de37-d943-4f15-aec7-c2f0bf0ca64d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:21:51 crc kubenswrapper[4747]: E1128 13:21:51.075727 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50\": container with ID starting with ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50 not found: ID does not exist" containerID="ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.075769 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50"} err="failed to get container status \"ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50\": rpc error: code = NotFound desc = could not find container \"ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50\": container with ID starting with ecdfd868c0211c4153bd4d8326cde46f06e040535e3a150b6dc317a3bf81af50 not found: ID does not exist" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.075817 4747 scope.go:117] "RemoveContainer" containerID="e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d" Nov 28 13:21:51 crc kubenswrapper[4747]: E1128 13:21:51.076616 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d\": container with ID starting with e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d not found: ID does not exist" containerID="e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.076635 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d"} 
err="failed to get container status \"e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d\": rpc error: code = NotFound desc = could not find container \"e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d\": container with ID starting with e22611e58e396c06675321b1b9e1b24ecff7a9f75fe37ea9744ee23c4c22966d not found: ID does not exist" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.076650 4747 scope.go:117] "RemoveContainer" containerID="f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402" Nov 28 13:21:51 crc kubenswrapper[4747]: E1128 13:21:51.076865 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402\": container with ID starting with f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402 not found: ID does not exist" containerID="f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.076882 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402"} err="failed to get container status \"f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402\": rpc error: code = NotFound desc = could not find container \"f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402\": container with ID starting with f5671755adb7fe8fcf04dcef93f3befc1661ee54eced4389c621e31bf7add402 not found: ID does not exist" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.134984 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.135013 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/9319de37-d943-4f15-aec7-c2f0bf0ca64d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.135023 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxrpl\" (UniqueName: \"kubernetes.io/projected/9319de37-d943-4f15-aec7-c2f0bf0ca64d-kube-api-access-dxrpl\") on node \"crc\" DevicePath \"\"" Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.290714 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9w47"] Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.296457 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9w47"] Nov 28 13:21:51 crc kubenswrapper[4747]: I1128 13:21:51.649458 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" path="/var/lib/kubelet/pods/9319de37-d943-4f15-aec7-c2f0bf0ca64d/volumes" Nov 28 13:21:53 crc kubenswrapper[4747]: I1128 13:21:53.411977 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-59qwc"] Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.724671 4747 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.725837 4747 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.725939 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49" gracePeriod=15 Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.726071 4747 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a" gracePeriod=15 Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.726127 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf" gracePeriod=15 Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.726175 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193" gracePeriod=15 Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.726202 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051" gracePeriod=15 Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728486 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="extract-utilities" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728512 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="extract-utilities" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728529 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728537 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728547 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="registry-server" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728555 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="registry-server" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728565 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="registry-server" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728572 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="registry-server" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728580 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728586 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728593 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728600 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728613 4747 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="extract-content" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728626 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="extract-content" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728640 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="extract-utilities" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728650 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="extract-utilities" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728662 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728670 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728679 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="extract-content" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728687 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="extract-content" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728696 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3209c35c-09ff-4df0-911d-ec6ca70edd81" containerName="pruner" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728705 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3209c35c-09ff-4df0-911d-ec6ca70edd81" containerName="pruner" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728714 4747 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728721 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728731 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728738 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.728745 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728751 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728862 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728878 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728892 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3209c35c-09ff-4df0-911d-ec6ca70edd81" containerName="pruner" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728901 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9319de37-d943-4f15-aec7-c2f0bf0ca64d" containerName="registry-server" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728909 4747 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728916 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7759307e-b584-4d97-af0e-d3aebe6f9f08" containerName="registry-server" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728927 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728936 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.728943 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.730421 4747 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.731172 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.736436 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Nov 28 13:22:09 crc kubenswrapper[4747]: E1128 13:22:09.811329 4747 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.828062 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.828130 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.828178 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 
crc kubenswrapper[4747]: I1128 13:22:09.828227 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.828252 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.828332 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.828361 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.828382 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc 
kubenswrapper[4747]: I1128 13:22:09.930192 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930346 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930386 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930402 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930479 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930533 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930560 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930574 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930601 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930809 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930620 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930843 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930877 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:09 crc kubenswrapper[4747]: I1128 13:22:09.930926 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.098968 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.100707 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.102001 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a" exitCode=0 Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.102307 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051" exitCode=0 Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.102118 4747 scope.go:117] "RemoveContainer" containerID="13ff4604ebf0ff012858743067d5dedbee29da3458a34e15da0f67a3099071a6" Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.102556 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf" exitCode=0 Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.102733 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193" exitCode=2 Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.106873 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="69090d14-4f69-445b-9da6-7cbf9d412a2b" containerID="a0db3e0886bd5968eb56b017475af57bc2516d564ad11d67ee48f72dbc9877a9" exitCode=0 Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.106984 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"69090d14-4f69-445b-9da6-7cbf9d412a2b","Type":"ContainerDied","Data":"a0db3e0886bd5968eb56b017475af57bc2516d564ad11d67ee48f72dbc9877a9"} Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.108005 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:10 crc kubenswrapper[4747]: I1128 13:22:10.113428 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:10 crc kubenswrapper[4747]: E1128 13:22:10.142707 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c2e5ccd7a3f88 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-28 
13:22:10.141601672 +0000 UTC m=+182.804083402,LastTimestamp:2025-11-28 13:22:10.141601672 +0000 UTC m=+182.804083402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.120284 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.123988 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479"} Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.124072 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c032deaffe09480840825663651502250332dadb77d476e4a53b97956f0fd532"} Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.125417 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:11 crc kubenswrapper[4747]: E1128 13:22:11.125428 4747 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.436973 4747 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.437784 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.451559 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-var-lock\") pod \"69090d14-4f69-445b-9da6-7cbf9d412a2b\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.451633 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-var-lock" (OuterVolumeSpecName: "var-lock") pod "69090d14-4f69-445b-9da6-7cbf9d412a2b" (UID: "69090d14-4f69-445b-9da6-7cbf9d412a2b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.451660 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69090d14-4f69-445b-9da6-7cbf9d412a2b-kube-api-access\") pod \"69090d14-4f69-445b-9da6-7cbf9d412a2b\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.451687 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-kubelet-dir\") pod \"69090d14-4f69-445b-9da6-7cbf9d412a2b\" (UID: \"69090d14-4f69-445b-9da6-7cbf9d412a2b\") " Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.451828 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "69090d14-4f69-445b-9da6-7cbf9d412a2b" (UID: "69090d14-4f69-445b-9da6-7cbf9d412a2b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.452072 4747 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-var-lock\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.452088 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69090d14-4f69-445b-9da6-7cbf9d412a2b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.458499 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69090d14-4f69-445b-9da6-7cbf9d412a2b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "69090d14-4f69-445b-9da6-7cbf9d412a2b" (UID: "69090d14-4f69-445b-9da6-7cbf9d412a2b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.554706 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69090d14-4f69-445b-9da6-7cbf9d412a2b-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:11 crc kubenswrapper[4747]: E1128 13:22:11.940544 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:11 crc kubenswrapper[4747]: E1128 13:22:11.941797 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:11 crc kubenswrapper[4747]: E1128 13:22:11.942333 4747 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:11 crc kubenswrapper[4747]: E1128 13:22:11.942795 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:11 crc kubenswrapper[4747]: E1128 13:22:11.943231 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:11 crc kubenswrapper[4747]: I1128 13:22:11.943373 4747 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 28 13:22:11 crc kubenswrapper[4747]: E1128 13:22:11.943734 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.133292 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.135710 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49" exitCode=0 Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.135938 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b17a361882a47b39b2ebe03c7eadf3f72e6e9b429ceb60c241ee90e6b8ff2a4" 
Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.136774 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.137587 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.138158 4747 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.138391 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.138589 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"69090d14-4f69-445b-9da6-7cbf9d412a2b","Type":"ContainerDied","Data":"d2085b88844284f43087faccc12f3cbc83981eb8cb322210c32a467557b83001"} Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.138719 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2085b88844284f43087faccc12f3cbc83981eb8cb322210c32a467557b83001" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.138861 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.143263 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.143608 4747 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:12 crc kubenswrapper[4747]: E1128 13:22:12.145572 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.162348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.162463 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.162516 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.162754 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.162794 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.162913 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.264554 4747 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.264939 4747 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:12 crc kubenswrapper[4747]: I1128 13:22:12.265092 4747 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:12 crc kubenswrapper[4747]: E1128 13:22:12.546561 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Nov 28 13:22:13 crc kubenswrapper[4747]: E1128 13:22:13.054706 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c2e5ccd7a3f88 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-28 13:22:10.141601672 +0000 UTC m=+182.804083402,LastTimestamp:2025-11-28 13:22:10.141601672 +0000 UTC m=+182.804083402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 28 13:22:13 crc kubenswrapper[4747]: I1128 13:22:13.144919 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:13 crc kubenswrapper[4747]: I1128 13:22:13.173741 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:13 crc kubenswrapper[4747]: I1128 13:22:13.174358 4747 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:13 crc kubenswrapper[4747]: E1128 13:22:13.347741 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Nov 28 13:22:13 crc kubenswrapper[4747]: I1128 13:22:13.652067 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" 
path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 28 13:22:14 crc kubenswrapper[4747]: I1128 13:22:14.000749 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:22:14 crc kubenswrapper[4747]: I1128 13:22:14.001771 4747 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:14 crc kubenswrapper[4747]: I1128 13:22:14.002360 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:14 crc kubenswrapper[4747]: E1128 13:22:14.948632 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="3.2s" Nov 28 13:22:17 crc kubenswrapper[4747]: I1128 13:22:17.633460 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:22:17 crc kubenswrapper[4747]: I1128 13:22:17.633903 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:22:17 crc kubenswrapper[4747]: I1128 13:22:17.646189 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:17 crc kubenswrapper[4747]: I1128 13:22:17.646940 4747 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:18 crc kubenswrapper[4747]: E1128 13:22:18.150275 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="6.4s" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.445292 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" containerName="oauth-openshift" containerID="cri-o://3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9" gracePeriod=15 Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.842385 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.842805 4747 status_manager.go:851] "Failed to get status for pod" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-59qwc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.843025 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.843415 4747 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862122 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-policies\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862528 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-ocp-branding-template\") pod 
\"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862599 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-router-certs\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862645 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-serving-cert\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862682 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-login\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862728 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-provider-selection\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862771 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdcdf\" (UniqueName: \"kubernetes.io/projected/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-kube-api-access-vdcdf\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: 
\"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862825 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-idp-0-file-data\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.862989 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-error\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863044 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-cliconfig\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863094 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-dir\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863130 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-trusted-ca-bundle\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 
13:22:18.863170 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863245 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-session\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863284 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-service-ca\") pod \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\" (UID: \"91f9d75e-6ca4-433a-8acd-ad2f23490d9a\") " Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863683 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863912 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.863954 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.865308 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.866005 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.869147 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-kube-api-access-vdcdf" (OuterVolumeSpecName: "kube-api-access-vdcdf") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "kube-api-access-vdcdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.871419 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.871627 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.872109 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.873564 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.874139 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.874492 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.874924 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.875314 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "91f9d75e-6ca4-433a-8acd-ad2f23490d9a" (UID: "91f9d75e-6ca4-433a-8acd-ad2f23490d9a"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964770 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964822 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964843 4747 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964864 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964915 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964934 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964954 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964972 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.964991 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.965010 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.965030 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.965051 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdcdf\" (UniqueName: \"kubernetes.io/projected/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-kube-api-access-vdcdf\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:18 crc kubenswrapper[4747]: I1128 13:22:18.965070 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91f9d75e-6ca4-433a-8acd-ad2f23490d9a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 
13:22:19.186323 4747 generic.go:334] "Generic (PLEG): container finished" podID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" containerID="3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9" exitCode=0 Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.186397 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" event={"ID":"91f9d75e-6ca4-433a-8acd-ad2f23490d9a","Type":"ContainerDied","Data":"3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9"} Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.186410 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.186451 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" event={"ID":"91f9d75e-6ca4-433a-8acd-ad2f23490d9a","Type":"ContainerDied","Data":"d53375e11699077812a7a370480c68d238c0c2c6b85f409ade9b9f6fa6d860a2"} Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.186494 4747 scope.go:117] "RemoveContainer" containerID="3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.187889 4747 status_manager.go:851] "Failed to get status for pod" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-59qwc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.188454 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": 
dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.189058 4747 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.215722 4747 scope.go:117] "RemoveContainer" containerID="3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9" Nov 28 13:22:19 crc kubenswrapper[4747]: E1128 13:22:19.216253 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9\": container with ID starting with 3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9 not found: ID does not exist" containerID="3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.216332 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9"} err="failed to get container status \"3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9\": rpc error: code = NotFound desc = could not find container \"3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9\": container with ID starting with 3de8efeda8ed607c0e05d9c703b3ecdc90f02a05f6cc6a2dd9248e1cdff36dc9 not found: ID does not exist" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.216447 4747 status_manager.go:851] "Failed to get status for pod" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-59qwc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.216917 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:19 crc kubenswrapper[4747]: I1128 13:22:19.217398 4747 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:20 crc kubenswrapper[4747]: I1128 13:22:20.640626 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:20 crc kubenswrapper[4747]: I1128 13:22:20.642400 4747 status_manager.go:851] "Failed to get status for pod" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-59qwc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:20 crc kubenswrapper[4747]: I1128 13:22:20.643134 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:20 crc kubenswrapper[4747]: I1128 13:22:20.643900 4747 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:20 crc kubenswrapper[4747]: I1128 13:22:20.658918 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:20 crc kubenswrapper[4747]: I1128 13:22:20.658965 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:20 crc kubenswrapper[4747]: E1128 13:22:20.659716 4747 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:20 crc kubenswrapper[4747]: I1128 13:22:20.660676 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.202777 4747 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="aa1944ebdd0b80af6cdb492eb576b9b8ec64c12455f7a5b498f1bca94ede8e27" exitCode=0 Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.202850 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"aa1944ebdd0b80af6cdb492eb576b9b8ec64c12455f7a5b498f1bca94ede8e27"} Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.202915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"af0fc94b917d37bd59ed8beff54cc6a0f87729dfe3a43fe7a283ec29f6411fd0"} Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.203402 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.203427 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.203745 4747 status_manager.go:851] "Failed to get status for pod" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" pod="openshift-authentication/oauth-openshift-558db77b4-59qwc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-59qwc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:21 crc kubenswrapper[4747]: 
E1128 13:22:21.204036 4747 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.204653 4747 status_manager.go:851] "Failed to get status for pod" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:21 crc kubenswrapper[4747]: I1128 13:22:21.205460 4747 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.143:6443: connect: connection refused" Nov 28 13:22:22 crc kubenswrapper[4747]: I1128 13:22:22.213800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"39dffd1cafa3bccba90937bdbe6391a5e5ac6a86bdea4e8ec47ff672dcbfed14"} Nov 28 13:22:22 crc kubenswrapper[4747]: I1128 13:22:22.214157 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d66b4464ea9dffa892d1331fee0e4fa4b87805aa6965aebd9176185f37ed6b57"} Nov 28 13:22:22 crc kubenswrapper[4747]: I1128 13:22:22.214176 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"49c3836759c9016c2bc50d6a5ef8173c68ce5edf4af336206fb89394445eb54c"} Nov 28 13:22:22 crc kubenswrapper[4747]: I1128 13:22:22.214188 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c9a914e8b7f9ba1be1cba5c85a4dd378e78a9f8eae38deeb14e973fc082efb4"} Nov 28 13:22:23 crc kubenswrapper[4747]: I1128 13:22:23.221159 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"038e0bedcfdb87879c6febd009a7b9eb64ca087844b13eaffdbc24f0b5a1c889"} Nov 28 13:22:23 crc kubenswrapper[4747]: I1128 13:22:23.221643 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:23 crc kubenswrapper[4747]: I1128 13:22:23.221657 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:23 crc kubenswrapper[4747]: I1128 13:22:23.222051 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:24 crc kubenswrapper[4747]: I1128 13:22:24.975596 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 28 13:22:24 crc kubenswrapper[4747]: I1128 13:22:24.976195 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 28 13:22:25 crc kubenswrapper[4747]: I1128 13:22:25.242251 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 28 13:22:25 crc kubenswrapper[4747]: I1128 13:22:25.242338 4747 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614" exitCode=1 Nov 28 13:22:25 crc kubenswrapper[4747]: I1128 13:22:25.242385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614"} Nov 28 13:22:25 crc kubenswrapper[4747]: I1128 13:22:25.243149 4747 scope.go:117] "RemoveContainer" containerID="6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614" Nov 28 13:22:25 crc kubenswrapper[4747]: I1128 13:22:25.662993 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:25 crc kubenswrapper[4747]: I1128 13:22:25.663064 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:25 crc kubenswrapper[4747]: I1128 13:22:25.671862 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:26 crc kubenswrapper[4747]: I1128 13:22:26.252759 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 28 13:22:26 crc kubenswrapper[4747]: I1128 13:22:26.252842 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b2491b89f7068a7356c96341e136d461ce79bb1aab3c8a7d74979444513864da"} Nov 28 13:22:27 crc kubenswrapper[4747]: I1128 13:22:27.118178 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:22:27 crc kubenswrapper[4747]: I1128 13:22:27.118535 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 28 13:22:27 crc kubenswrapper[4747]: I1128 13:22:27.118621 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 28 13:22:27 crc kubenswrapper[4747]: I1128 13:22:27.706142 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:22:28 crc kubenswrapper[4747]: I1128 13:22:28.229708 4747 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:28 crc kubenswrapper[4747]: I1128 13:22:28.265353 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:28 crc kubenswrapper[4747]: I1128 13:22:28.265385 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:28 crc kubenswrapper[4747]: I1128 13:22:28.268732 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:28 crc kubenswrapper[4747]: I1128 13:22:28.320077 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4728506a-5838-4daa-a41c-88b1ca87c542" Nov 28 13:22:29 crc kubenswrapper[4747]: I1128 13:22:29.272146 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:29 crc kubenswrapper[4747]: I1128 13:22:29.272190 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:29 crc kubenswrapper[4747]: I1128 13:22:29.275292 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4728506a-5838-4daa-a41c-88b1ca87c542" Nov 28 13:22:34 crc kubenswrapper[4747]: I1128 13:22:34.500189 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 28 13:22:34 crc kubenswrapper[4747]: I1128 13:22:34.625684 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 28 13:22:34 crc kubenswrapper[4747]: I1128 13:22:34.724043 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"serving-cert" Nov 28 13:22:35 crc kubenswrapper[4747]: I1128 13:22:35.317511 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 28 13:22:35 crc kubenswrapper[4747]: I1128 13:22:35.868765 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 28 13:22:36 crc kubenswrapper[4747]: I1128 13:22:36.129330 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 28 13:22:36 crc kubenswrapper[4747]: I1128 13:22:36.333816 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 28 13:22:36 crc kubenswrapper[4747]: I1128 13:22:36.348692 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 28 13:22:37 crc kubenswrapper[4747]: I1128 13:22:37.118790 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 28 13:22:37 crc kubenswrapper[4747]: I1128 13:22:37.119452 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 28 13:22:37 crc kubenswrapper[4747]: I1128 13:22:37.560155 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 28 13:22:37 crc kubenswrapper[4747]: I1128 13:22:37.646978 4747 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 28 13:22:37 crc kubenswrapper[4747]: I1128 13:22:37.711801 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 28 13:22:37 crc kubenswrapper[4747]: I1128 13:22:37.795700 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 28 13:22:37 crc kubenswrapper[4747]: I1128 13:22:37.985117 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 28 13:22:38 crc kubenswrapper[4747]: I1128 13:22:38.133526 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 28 13:22:38 crc kubenswrapper[4747]: I1128 13:22:38.194502 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 28 13:22:38 crc kubenswrapper[4747]: I1128 13:22:38.316260 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 28 13:22:38 crc kubenswrapper[4747]: I1128 13:22:38.423918 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 28 13:22:38 crc kubenswrapper[4747]: I1128 13:22:38.481822 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 28 13:22:38 crc kubenswrapper[4747]: I1128 13:22:38.530805 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 28 13:22:38 crc kubenswrapper[4747]: I1128 13:22:38.694177 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.342402 4747 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.351101 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.397246 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.574949 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.608551 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.646001 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.781963 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 28 13:22:39 crc kubenswrapper[4747]: I1128 13:22:39.810400 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.054602 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.075662 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.124094 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.161289 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.283887 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.287826 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.385733 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.543996 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.598927 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 28 13:22:40 crc kubenswrapper[4747]: I1128 13:22:40.665789 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 28 13:22:41 crc kubenswrapper[4747]: I1128 13:22:41.049408 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 28 13:22:41 crc kubenswrapper[4747]: I1128 13:22:41.217639 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 28 13:22:41 crc kubenswrapper[4747]: I1128 13:22:41.446381 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 28 13:22:41 crc kubenswrapper[4747]: I1128 13:22:41.856623 4747 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 28 13:22:41 crc kubenswrapper[4747]: I1128 13:22:41.951462 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 28 13:22:41 crc kubenswrapper[4747]: I1128 13:22:41.974578 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 28 13:22:41 crc kubenswrapper[4747]: I1128 13:22:41.974669 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.048905 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.064420 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.362646 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.417960 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.428379 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.494966 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.626916 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.848173 4747 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 28 13:22:42 crc kubenswrapper[4747]: I1128 13:22:42.885789 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.301124 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.544657 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.573769 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.577169 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.704951 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.815731 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.916788 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.924094 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.932690 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 28 13:22:43 crc kubenswrapper[4747]: I1128 13:22:43.967519 4747 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.094473 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.111056 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.224116 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.618793 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.635896 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.690613 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.780914 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.952571 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 28 13:22:44 crc kubenswrapper[4747]: I1128 13:22:44.980146 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.192616 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.281743 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.291178 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.321661 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.387984 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.411443 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.518109 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.555957 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.654980 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.720107 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.726401 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.755494 4747 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.826734 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 28 13:22:45 crc kubenswrapper[4747]: I1128 13:22:45.911342 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.039514 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.042587 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.251286 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.302915 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.453061 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.578142 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.590283 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.653454 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 
13:22:46.662899 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.686792 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.747105 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.783305 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.797436 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 28 13:22:46 crc kubenswrapper[4747]: I1128 13:22:46.828964 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.038591 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.115152 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.118535 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.118601 4747 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.118660 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.119510 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b2491b89f7068a7356c96341e136d461ce79bb1aab3c8a7d74979444513864da"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.119691 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b2491b89f7068a7356c96341e136d461ce79bb1aab3c8a7d74979444513864da" gracePeriod=30 Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.199388 4747 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.312045 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.338306 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.344749 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.365505 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.447498 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.454979 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.578871 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.610419 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.621613 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.633289 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.633357 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 
13:22:47.633416 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.634565 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.634639 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd" gracePeriod=600 Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.664295 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.664459 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.673735 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.751387 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 28 13:22:47 crc kubenswrapper[4747]: I1128 13:22:47.947673 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 
13:22:48.008966 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.009099 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.272869 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.279886 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.301976 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.394098 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd" exitCode=0 Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.394160 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd"} Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.394281 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"ce863bbe790d4baf6afaf9f339a317c74a3f3d4a309ae619b5d042b46992a7f6"} Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.403476 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 28 
13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.491531 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.526638 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.602718 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.671850 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.702317 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.709440 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.717393 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.797389 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 28 13:22:48 crc kubenswrapper[4747]: I1128 13:22:48.799353 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.012418 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.012458 4747 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.058158 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.109382 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.151060 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.224504 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.304474 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.309590 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.343905 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.382591 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.406851 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.419268 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 28 
13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.422965 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.437019 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.495698 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.550313 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.616740 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.637471 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.655986 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.673203 4747 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.698816 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.778913 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.876036 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"openshift-service-ca.crt" Nov 28 13:22:49 crc kubenswrapper[4747]: I1128 13:22:49.943114 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.075308 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.174348 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.271486 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.277995 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.282970 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.314746 4747 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.362146 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.391399 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.391505 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.409878 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.412719 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.444725 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.528554 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.676543 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.730361 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.824175 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.904267 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.927021 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 28 13:22:50 crc kubenswrapper[4747]: I1128 13:22:50.937815 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.041894 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.041899 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.061627 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.183106 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.418237 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.481569 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.531724 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.599139 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.644441 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.655826 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.674986 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.721196 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 28 13:22:51 crc kubenswrapper[4747]: 
I1128 13:22:51.825916 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.840617 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.866760 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 28 13:22:51 crc kubenswrapper[4747]: I1128 13:22:51.917785 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.070756 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.216249 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.261940 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.347092 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.411761 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.438775 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.519121 4747 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.595630 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.691641 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.728196 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 28 13:22:52 crc kubenswrapper[4747]: I1128 13:22:52.806666 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 28 13:22:53 crc kubenswrapper[4747]: I1128 13:22:53.139357 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 28 13:22:53 crc kubenswrapper[4747]: I1128 13:22:53.306288 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 28 13:22:53 crc kubenswrapper[4747]: I1128 13:22:53.357627 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 28 13:22:53 crc kubenswrapper[4747]: I1128 13:22:53.674536 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 28 13:22:53 crc kubenswrapper[4747]: I1128 13:22:53.759783 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 28 13:22:53 crc kubenswrapper[4747]: I1128 13:22:53.976370 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.251807 
4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.273674 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.427765 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.451676 4747 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.454296 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.784509 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.846805 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.869972 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 28 13:22:54 crc kubenswrapper[4747]: I1128 13:22:54.970296 4747 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.010711 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.031160 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 28 13:22:55 crc 
kubenswrapper[4747]: I1128 13:22:55.063177 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.068487 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.082387 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.189752 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.384075 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.574078 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.594198 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.888824 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 28 13:22:55 crc kubenswrapper[4747]: I1128 13:22:55.944550 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.029123 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.052185 4747 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.137827 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.315837 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.540310 4747 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.544321 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-59qwc"] Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.544371 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q"] Nov 28 13:22:57 crc kubenswrapper[4747]: E1128 13:22:57.544539 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" containerName="oauth-openshift" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.544555 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" containerName="oauth-openshift" Nov 28 13:22:57 crc kubenswrapper[4747]: E1128 13:22:57.544565 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" containerName="installer" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.544571 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" containerName="installer" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 
13:22:57.544667 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" containerName="oauth-openshift" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.544682 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="69090d14-4f69-445b-9da6-7cbf9d412a2b" containerName="installer" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.544692 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.544712 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="496bd000-e0cc-4af2-a1a6-04f392e21371" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.545051 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.547192 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.548415 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.548681 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.548950 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.549174 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.549334 4747 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.549650 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.549853 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.550623 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.550936 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.551426 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.552253 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.554842 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.564662 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.567579 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.570256 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.598532 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=29.598513792 podStartE2EDuration="29.598513792s" podCreationTimestamp="2025-11-28 13:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:22:57.594183174 +0000 UTC m=+230.256664954" watchObservedRunningTime="2025-11-28 13:22:57.598513792 +0000 UTC m=+230.260995532" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612318 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-router-certs\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612339 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4af7f847-90e3-4314-86d9-5af59e908f83-audit-dir\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" 
Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612386 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjk7j\" (UniqueName: \"kubernetes.io/projected/4af7f847-90e3-4314-86d9-5af59e908f83-kube-api-access-sjk7j\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612418 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-login\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612446 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612470 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612549 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-error\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612612 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-session\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612733 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612761 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-audit-policies\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612798 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.612845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-service-ca\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.648396 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f9d75e-6ca4-433a-8acd-ad2f23490d9a" path="/var/lib/kubelet/pods/91f9d75e-6ca4-433a-8acd-ad2f23490d9a/volumes" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713543 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-error\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-session\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713653 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713692 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713726 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-audit-policies\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " 
pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713812 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-service-ca\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713859 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713901 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-router-certs\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713932 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4af7f847-90e3-4314-86d9-5af59e908f83-audit-dir\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.713962 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjk7j\" (UniqueName: 
\"kubernetes.io/projected/4af7f847-90e3-4314-86d9-5af59e908f83-kube-api-access-sjk7j\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.714032 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-login\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.714070 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.714100 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.716175 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4af7f847-90e3-4314-86d9-5af59e908f83-audit-dir\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " 
pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.717081 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-service-ca\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.718813 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.719861 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-audit-policies\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.720183 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.722840 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.722844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.723328 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.723510 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-router-certs\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.723858 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-error\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " 
pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.726034 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-login\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.726636 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-system-session\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.730639 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4af7f847-90e3-4314-86d9-5af59e908f83-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.737916 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjk7j\" (UniqueName: \"kubernetes.io/projected/4af7f847-90e3-4314-86d9-5af59e908f83-kube-api-access-sjk7j\") pod \"oauth-openshift-5ff5db57ff-8fv5q\" (UID: \"4af7f847-90e3-4314-86d9-5af59e908f83\") " pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:57 crc kubenswrapper[4747]: I1128 13:22:57.861090 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:58 crc kubenswrapper[4747]: I1128 13:22:58.051615 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 28 13:22:58 crc kubenswrapper[4747]: I1128 13:22:58.085949 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q"] Nov 28 13:22:58 crc kubenswrapper[4747]: I1128 13:22:58.457113 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" event={"ID":"4af7f847-90e3-4314-86d9-5af59e908f83","Type":"ContainerStarted","Data":"fc36b761214d516090ae1c833261480cef37a111e778d849bc06d3361cec851e"} Nov 28 13:22:58 crc kubenswrapper[4747]: I1128 13:22:58.457192 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" event={"ID":"4af7f847-90e3-4314-86d9-5af59e908f83","Type":"ContainerStarted","Data":"7c116e8bde65fb760fb7b5f9f19e7327f3c7aaa982f8f5cdfbbc72217a1b3f41"} Nov 28 13:22:58 crc kubenswrapper[4747]: I1128 13:22:58.486152 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" podStartSLOduration=65.486125356 podStartE2EDuration="1m5.486125356s" podCreationTimestamp="2025-11-28 13:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:22:58.483289945 +0000 UTC m=+231.145771705" watchObservedRunningTime="2025-11-28 13:22:58.486125356 +0000 UTC m=+231.148607106" Nov 28 13:22:58 crc kubenswrapper[4747]: I1128 13:22:58.906444 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 28 13:22:58 crc kubenswrapper[4747]: I1128 13:22:58.920498 4747 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 28 13:22:59 crc kubenswrapper[4747]: I1128 13:22:59.463030 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:22:59 crc kubenswrapper[4747]: I1128 13:22:59.471788 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5ff5db57ff-8fv5q" Nov 28 13:23:01 crc kubenswrapper[4747]: I1128 13:23:01.980880 4747 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 28 13:23:01 crc kubenswrapper[4747]: I1128 13:23:01.981614 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479" gracePeriod=5 Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.138864 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.139371 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250480 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250513 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250607 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250627 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250643 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250791 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: 
"var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250774 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250825 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.250865 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.259931 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.351661 4747 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.351699 4747 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.351712 4747 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.351723 4747 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.351735 4747 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.512694 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.512741 4747 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479" exitCode=137 Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.512788 4747 scope.go:117] "RemoveContainer" 
containerID="3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.512887 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.529505 4747 scope.go:117] "RemoveContainer" containerID="3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479" Nov 28 13:23:07 crc kubenswrapper[4747]: E1128 13:23:07.529863 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479\": container with ID starting with 3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479 not found: ID does not exist" containerID="3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.529893 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479"} err="failed to get container status \"3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479\": rpc error: code = NotFound desc = could not find container \"3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479\": container with ID starting with 3a9cce103f1854b6e2d5cd33b21f501e33db5c26abfe274c8fbcf8cea92e5479 not found: ID does not exist" Nov 28 13:23:07 crc kubenswrapper[4747]: I1128 13:23:07.654725 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 28 13:23:12 crc kubenswrapper[4747]: I1128 13:23:12.551636 4747 generic.go:334] "Generic (PLEG): container finished" podID="6f46540d-f949-4ebd-aa09-0336f09ddfef" 
containerID="656fd1f5e0c28a80a1331711c8e141c4bc0c7bf14e7a07d5eb92c85e7eb8b88c" exitCode=0 Nov 28 13:23:12 crc kubenswrapper[4747]: I1128 13:23:12.551865 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" event={"ID":"6f46540d-f949-4ebd-aa09-0336f09ddfef","Type":"ContainerDied","Data":"656fd1f5e0c28a80a1331711c8e141c4bc0c7bf14e7a07d5eb92c85e7eb8b88c"} Nov 28 13:23:12 crc kubenswrapper[4747]: I1128 13:23:12.553026 4747 scope.go:117] "RemoveContainer" containerID="656fd1f5e0c28a80a1331711c8e141c4bc0c7bf14e7a07d5eb92c85e7eb8b88c" Nov 28 13:23:13 crc kubenswrapper[4747]: I1128 13:23:13.563915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" event={"ID":"6f46540d-f949-4ebd-aa09-0336f09ddfef","Type":"ContainerStarted","Data":"e7c4446189b44f89a77c16d2212e1ab5a21d69d3af35fd461a2d33fcbdebd2df"} Nov 28 13:23:13 crc kubenswrapper[4747]: I1128 13:23:13.565469 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:23:13 crc kubenswrapper[4747]: I1128 13:23:13.568164 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:23:17 crc kubenswrapper[4747]: I1128 13:23:17.598370 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 28 13:23:17 crc kubenswrapper[4747]: I1128 13:23:17.619355 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 28 13:23:17 crc kubenswrapper[4747]: I1128 13:23:17.619464 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="b2491b89f7068a7356c96341e136d461ce79bb1aab3c8a7d74979444513864da" exitCode=137 Nov 28 13:23:17 crc kubenswrapper[4747]: I1128 13:23:17.619517 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b2491b89f7068a7356c96341e136d461ce79bb1aab3c8a7d74979444513864da"} Nov 28 13:23:17 crc kubenswrapper[4747]: I1128 13:23:17.619579 4747 scope.go:117] "RemoveContainer" containerID="6d7f9506c11d6f7da6cbc69255dafef8e0c88992190ebcc21c521d171b44d614" Nov 28 13:23:18 crc kubenswrapper[4747]: I1128 13:23:18.626558 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 28 13:23:18 crc kubenswrapper[4747]: I1128 13:23:18.628764 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8a899bb7dddc756eaa50f5cc5b948a6f2e7d880e688b2fa9ac2dd24685c43f9d"} Nov 28 13:23:27 crc kubenswrapper[4747]: I1128 13:23:27.117834 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:23:27 crc kubenswrapper[4747]: I1128 13:23:27.124789 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:23:27 crc kubenswrapper[4747]: I1128 13:23:27.685922 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:23:28 crc kubenswrapper[4747]: I1128 13:23:28.700283 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.102585 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"] Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.103353 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" podUID="d89e4b8f-9b72-467e-bc54-c4e6421717ac" containerName="route-controller-manager" containerID="cri-o://65ef96eace41e35acc874546075a8eb4cb946e210efea8db6ca381d15c691633" gracePeriod=30 Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.113957 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-926zp"] Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.114418 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" podUID="a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" containerName="controller-manager" containerID="cri-o://a698a1c1fafd60a1fa7f693de98e5a04457a8d5aa54733cdc08f68c082c158e2" gracePeriod=30 Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.782367 4747 generic.go:334] "Generic (PLEG): container finished" podID="d89e4b8f-9b72-467e-bc54-c4e6421717ac" containerID="65ef96eace41e35acc874546075a8eb4cb946e210efea8db6ca381d15c691633" exitCode=0 Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.782773 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" event={"ID":"d89e4b8f-9b72-467e-bc54-c4e6421717ac","Type":"ContainerDied","Data":"65ef96eace41e35acc874546075a8eb4cb946e210efea8db6ca381d15c691633"} Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.784786 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" containerID="a698a1c1fafd60a1fa7f693de98e5a04457a8d5aa54733cdc08f68c082c158e2" exitCode=0 Nov 28 13:23:39 crc kubenswrapper[4747]: I1128 13:23:39.784819 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" event={"ID":"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f","Type":"ContainerDied","Data":"a698a1c1fafd60a1fa7f693de98e5a04457a8d5aa54733cdc08f68c082c158e2"} Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.228602 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.239505 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.252315 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c7d768ccb-fln9d"] Nov 28 13:23:40 crc kubenswrapper[4747]: E1128 13:23:40.252574 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" containerName="controller-manager" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.252594 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" containerName="controller-manager" Nov 28 13:23:40 crc kubenswrapper[4747]: E1128 13:23:40.252609 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89e4b8f-9b72-467e-bc54-c4e6421717ac" containerName="route-controller-manager" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.252620 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89e4b8f-9b72-467e-bc54-c4e6421717ac" containerName="route-controller-manager" Nov 28 13:23:40 crc kubenswrapper[4747]: E1128 13:23:40.252638 4747 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.252645 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.252768 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.252782 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89e4b8f-9b72-467e-bc54-c4e6421717ac" containerName="route-controller-manager" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.252792 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" containerName="controller-manager" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.253211 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.265668 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c7d768ccb-fln9d"] Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419121 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-config\") pod \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419184 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-proxy-ca-bundles\") pod \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419239 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2zkl\" (UniqueName: \"kubernetes.io/projected/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-kube-api-access-g2zkl\") pod \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-serving-cert\") pod \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419299 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-config\") pod \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\" (UID: 
\"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419347 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-client-ca\") pod \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\" (UID: \"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419392 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-client-ca\") pod \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419425 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d89e4b8f-9b72-467e-bc54-c4e6421717ac-serving-cert\") pod \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419455 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnqkz\" (UniqueName: \"kubernetes.io/projected/d89e4b8f-9b72-467e-bc54-c4e6421717ac-kube-api-access-bnqkz\") pod \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\" (UID: \"d89e4b8f-9b72-467e-bc54-c4e6421717ac\") " Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419644 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-proxy-ca-bundles\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419698 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-config\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419724 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnjd5\" (UniqueName: \"kubernetes.io/projected/89310d93-7fdd-4c40-be1d-3da6c18c9f51-kube-api-access-xnjd5\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419789 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89310d93-7fdd-4c40-be1d-3da6c18c9f51-serving-cert\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.419810 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-client-ca\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.420048 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-config" (OuterVolumeSpecName: "config") pod "d89e4b8f-9b72-467e-bc54-c4e6421717ac" (UID: "d89e4b8f-9b72-467e-bc54-c4e6421717ac"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.420155 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" (UID: "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.420535 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-client-ca" (OuterVolumeSpecName: "client-ca") pod "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" (UID: "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.420543 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-client-ca" (OuterVolumeSpecName: "client-ca") pod "d89e4b8f-9b72-467e-bc54-c4e6421717ac" (UID: "d89e4b8f-9b72-467e-bc54-c4e6421717ac"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.420882 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-config" (OuterVolumeSpecName: "config") pod "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" (UID: "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.425471 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-kube-api-access-g2zkl" (OuterVolumeSpecName: "kube-api-access-g2zkl") pod "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" (UID: "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f"). InnerVolumeSpecName "kube-api-access-g2zkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.425988 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89e4b8f-9b72-467e-bc54-c4e6421717ac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d89e4b8f-9b72-467e-bc54-c4e6421717ac" (UID: "d89e4b8f-9b72-467e-bc54-c4e6421717ac"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.426404 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89e4b8f-9b72-467e-bc54-c4e6421717ac-kube-api-access-bnqkz" (OuterVolumeSpecName: "kube-api-access-bnqkz") pod "d89e4b8f-9b72-467e-bc54-c4e6421717ac" (UID: "d89e4b8f-9b72-467e-bc54-c4e6421717ac"). InnerVolumeSpecName "kube-api-access-bnqkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.427772 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" (UID: "a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521422 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-config\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521470 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnjd5\" (UniqueName: \"kubernetes.io/projected/89310d93-7fdd-4c40-be1d-3da6c18c9f51-kube-api-access-xnjd5\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521519 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89310d93-7fdd-4c40-be1d-3da6c18c9f51-serving-cert\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521543 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-client-ca\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521595 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-proxy-ca-bundles\") pod 
\"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521661 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521680 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521692 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2zkl\" (UniqueName: \"kubernetes.io/projected/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-kube-api-access-g2zkl\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521700 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521709 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521719 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521729 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d89e4b8f-9b72-467e-bc54-c4e6421717ac-client-ca\") on node \"crc\" DevicePath \"\"" Nov 
28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521738 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d89e4b8f-9b72-467e-bc54-c4e6421717ac-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.521750 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnqkz\" (UniqueName: \"kubernetes.io/projected/d89e4b8f-9b72-467e-bc54-c4e6421717ac-kube-api-access-bnqkz\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.522839 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-client-ca\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.522952 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-config\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.523324 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-proxy-ca-bundles\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.529193 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/89310d93-7fdd-4c40-be1d-3da6c18c9f51-serving-cert\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.536242 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnjd5\" (UniqueName: \"kubernetes.io/projected/89310d93-7fdd-4c40-be1d-3da6c18c9f51-kube-api-access-xnjd5\") pod \"controller-manager-6c7d768ccb-fln9d\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.576360 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.793485 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.796262 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-926zp" event={"ID":"a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f","Type":"ContainerDied","Data":"6ad209ca8f97a2f5d0ace27c3a5d57637772124dc95d694515e37aba73461f62"} Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.796301 4747 scope.go:117] "RemoveContainer" containerID="a698a1c1fafd60a1fa7f693de98e5a04457a8d5aa54733cdc08f68c082c158e2" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.797890 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" event={"ID":"d89e4b8f-9b72-467e-bc54-c4e6421717ac","Type":"ContainerDied","Data":"ce3b434a232534819c998489ef7c6be067cc0b57c4fb998e0f3e30972c1b9922"} Nov 28 13:23:40 crc 
kubenswrapper[4747]: I1128 13:23:40.797983 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.813962 4747 scope.go:117] "RemoveContainer" containerID="65ef96eace41e35acc874546075a8eb4cb946e210efea8db6ca381d15c691633" Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.823692 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c7d768ccb-fln9d"] Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.835332 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"] Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.838587 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-67648"] Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.846124 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-926zp"] Nov 28 13:23:40 crc kubenswrapper[4747]: I1128 13:23:40.853568 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-926zp"] Nov 28 13:23:41 crc kubenswrapper[4747]: I1128 13:23:41.650760 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f" path="/var/lib/kubelet/pods/a23784fb-7b52-4a0c-9b05-08e7ed5c0d7f/volumes" Nov 28 13:23:41 crc kubenswrapper[4747]: I1128 13:23:41.652139 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89e4b8f-9b72-467e-bc54-c4e6421717ac" path="/var/lib/kubelet/pods/d89e4b8f-9b72-467e-bc54-c4e6421717ac/volumes" Nov 28 13:23:41 crc kubenswrapper[4747]: I1128 13:23:41.807028 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" event={"ID":"89310d93-7fdd-4c40-be1d-3da6c18c9f51","Type":"ContainerStarted","Data":"4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da"} Nov 28 13:23:41 crc kubenswrapper[4747]: I1128 13:23:41.807075 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" event={"ID":"89310d93-7fdd-4c40-be1d-3da6c18c9f51","Type":"ContainerStarted","Data":"7983955f34a8b94ab419fbfce9c9c12d21372de60a7d04cf5457e7f53fd4232f"} Nov 28 13:23:41 crc kubenswrapper[4747]: I1128 13:23:41.807672 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:41 crc kubenswrapper[4747]: I1128 13:23:41.813148 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:23:41 crc kubenswrapper[4747]: I1128 13:23:41.849088 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" podStartSLOduration=2.849066039 podStartE2EDuration="2.849066039s" podCreationTimestamp="2025-11-28 13:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:23:41.824597284 +0000 UTC m=+274.487079014" watchObservedRunningTime="2025-11-28 13:23:41.849066039 +0000 UTC m=+274.511547799" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.626801 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w"] Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.628122 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.630252 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.630265 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.631322 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.631498 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.631710 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.631952 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.635930 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w"] Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.751043 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d75d9b7-7aaf-4323-8712-e48e221128d3-serving-cert\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.751103 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n2gn\" (UniqueName: \"kubernetes.io/projected/3d75d9b7-7aaf-4323-8712-e48e221128d3-kube-api-access-9n2gn\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.751126 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-config\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.751140 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-client-ca\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.852918 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d75d9b7-7aaf-4323-8712-e48e221128d3-serving-cert\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.852994 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n2gn\" (UniqueName: \"kubernetes.io/projected/3d75d9b7-7aaf-4323-8712-e48e221128d3-kube-api-access-9n2gn\") pod 
\"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.853026 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-config\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.853046 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-client-ca\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.854280 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-client-ca\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.856856 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-config\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.860469 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d75d9b7-7aaf-4323-8712-e48e221128d3-serving-cert\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.878988 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n2gn\" (UniqueName: \"kubernetes.io/projected/3d75d9b7-7aaf-4323-8712-e48e221128d3-kube-api-access-9n2gn\") pod \"route-controller-manager-7fcb84bb5d-2qt2w\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:42 crc kubenswrapper[4747]: I1128 13:23:42.951864 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:43 crc kubenswrapper[4747]: I1128 13:23:43.164412 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w"] Nov 28 13:23:43 crc kubenswrapper[4747]: I1128 13:23:43.822542 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" event={"ID":"3d75d9b7-7aaf-4323-8712-e48e221128d3","Type":"ContainerStarted","Data":"c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f"} Nov 28 13:23:43 crc kubenswrapper[4747]: I1128 13:23:43.822827 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" event={"ID":"3d75d9b7-7aaf-4323-8712-e48e221128d3","Type":"ContainerStarted","Data":"0f59647c577fbfe90707393696f4b05ed9e11cac1cd3402dc64aec8c0d5b0214"} Nov 28 13:23:43 crc kubenswrapper[4747]: I1128 13:23:43.822965 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:43 crc kubenswrapper[4747]: I1128 13:23:43.833306 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:23:43 crc kubenswrapper[4747]: I1128 13:23:43.856016 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" podStartSLOduration=4.855995865 podStartE2EDuration="4.855995865s" podCreationTimestamp="2025-11-28 13:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:23:43.854274681 +0000 UTC m=+276.516756411" watchObservedRunningTime="2025-11-28 13:23:43.855995865 +0000 UTC m=+276.518477585" Nov 28 13:24:07 crc kubenswrapper[4747]: I1128 13:24:07.479379 4747 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.289794 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gccdc"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.291340 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.306371 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gccdc"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.452928 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b8465de-0e54-41a1-986c-c56ac227e6d0-trusted-ca\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.452991 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvkf6\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-kube-api-access-qvkf6\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.453015 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b8465de-0e54-41a1-986c-c56ac227e6d0-registry-certificates\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.453036 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b8465de-0e54-41a1-986c-c56ac227e6d0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.453066 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.453085 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-registry-tls\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.453124 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b8465de-0e54-41a1-986c-c56ac227e6d0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.453139 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-bound-sa-token\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.474032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.554969 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b8465de-0e54-41a1-986c-c56ac227e6d0-trusted-ca\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.555053 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvkf6\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-kube-api-access-qvkf6\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.555082 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b8465de-0e54-41a1-986c-c56ac227e6d0-registry-certificates\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.555105 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b8465de-0e54-41a1-986c-c56ac227e6d0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.555137 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-registry-tls\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.555187 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b8465de-0e54-41a1-986c-c56ac227e6d0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.555229 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-bound-sa-token\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.556251 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b8465de-0e54-41a1-986c-c56ac227e6d0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.556685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b8465de-0e54-41a1-986c-c56ac227e6d0-registry-certificates\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.557953 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b8465de-0e54-41a1-986c-c56ac227e6d0-trusted-ca\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.563308 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b8465de-0e54-41a1-986c-c56ac227e6d0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.563364 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-registry-tls\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.571669 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-bound-sa-token\") pod \"image-registry-66df7c8f76-gccdc\" (UID: \"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.572458 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvkf6\" (UniqueName: \"kubernetes.io/projected/9b8465de-0e54-41a1-986c-c56ac227e6d0-kube-api-access-qvkf6\") pod \"image-registry-66df7c8f76-gccdc\" (UID: 
\"9b8465de-0e54-41a1-986c-c56ac227e6d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.613231 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.694339 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7khwx"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.694716 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7khwx" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="registry-server" containerID="cri-o://907f61ef0c548664a6655890d1f078da6fa91d84c74e7cbdbbb7165d1d45bfd4" gracePeriod=30 Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.701266 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6d995"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.715316 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dr67x"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.715825 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" containerID="cri-o://e7c4446189b44f89a77c16d2212e1ab5a21d69d3af35fd461a2d33fcbdebd2df" gracePeriod=30 Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.716060 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6d995" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="registry-server" containerID="cri-o://cd9084b0d713eda79fa288e5c48b0d0ef50840c78894fad81451d4686129be55" gracePeriod=30 Nov 28 13:24:21 crc 
kubenswrapper[4747]: I1128 13:24:21.725027 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-86pgk"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.725463 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-86pgk" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="registry-server" containerID="cri-o://5af70dbfe60b9acdeda3ce485d7c6f3319a3abe090d371d6b726ade2fdc121a3" gracePeriod=30 Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.727083 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvsv5"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.727409 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvsv5" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="registry-server" containerID="cri-o://8ac6e215e0585923eab640f34afcd9c8f2017e167b3a48ec06f6542870e85ed9" gracePeriod=30 Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.733239 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bb4cp"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.733887 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.765510 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bb4cp"] Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.784887 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8ce1410-e45d-4cb9-a8b3-de758929de4b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.785068 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6q52\" (UniqueName: \"kubernetes.io/projected/f8ce1410-e45d-4cb9-a8b3-de758929de4b-kube-api-access-t6q52\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.785101 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f8ce1410-e45d-4cb9-a8b3-de758929de4b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.888570 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6q52\" (UniqueName: \"kubernetes.io/projected/f8ce1410-e45d-4cb9-a8b3-de758929de4b-kube-api-access-t6q52\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: 
\"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.888635 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f8ce1410-e45d-4cb9-a8b3-de758929de4b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.888667 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8ce1410-e45d-4cb9-a8b3-de758929de4b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.891522 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8ce1410-e45d-4cb9-a8b3-de758929de4b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.898908 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f8ce1410-e45d-4cb9-a8b3-de758929de4b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:21 crc kubenswrapper[4747]: I1128 13:24:21.905142 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t6q52\" (UniqueName: \"kubernetes.io/projected/f8ce1410-e45d-4cb9-a8b3-de758929de4b-kube-api-access-t6q52\") pod \"marketplace-operator-79b997595-bb4cp\" (UID: \"f8ce1410-e45d-4cb9-a8b3-de758929de4b\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.041696 4747 generic.go:334] "Generic (PLEG): container finished" podID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerID="8ac6e215e0585923eab640f34afcd9c8f2017e167b3a48ec06f6542870e85ed9" exitCode=0 Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.041782 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvsv5" event={"ID":"4e4d880f-9a11-4d82-b099-1fbd6cae11ec","Type":"ContainerDied","Data":"8ac6e215e0585923eab640f34afcd9c8f2017e167b3a48ec06f6542870e85ed9"} Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.043031 4747 generic.go:334] "Generic (PLEG): container finished" podID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerID="cd9084b0d713eda79fa288e5c48b0d0ef50840c78894fad81451d4686129be55" exitCode=0 Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.043069 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d995" event={"ID":"e05164c9-17fa-41ef-9abe-b00460c2cb96","Type":"ContainerDied","Data":"cd9084b0d713eda79fa288e5c48b0d0ef50840c78894fad81451d4686129be55"} Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.044983 4747 generic.go:334] "Generic (PLEG): container finished" podID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerID="5af70dbfe60b9acdeda3ce485d7c6f3319a3abe090d371d6b726ade2fdc121a3" exitCode=0 Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.045021 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86pgk" 
event={"ID":"0a8a42c3-62aa-4542-b86e-171e124c81f4","Type":"ContainerDied","Data":"5af70dbfe60b9acdeda3ce485d7c6f3319a3abe090d371d6b726ade2fdc121a3"} Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.048480 4747 generic.go:334] "Generic (PLEG): container finished" podID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerID="e7c4446189b44f89a77c16d2212e1ab5a21d69d3af35fd461a2d33fcbdebd2df" exitCode=0 Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.048523 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" event={"ID":"6f46540d-f949-4ebd-aa09-0336f09ddfef","Type":"ContainerDied","Data":"e7c4446189b44f89a77c16d2212e1ab5a21d69d3af35fd461a2d33fcbdebd2df"} Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.048550 4747 scope.go:117] "RemoveContainer" containerID="656fd1f5e0c28a80a1331711c8e141c4bc0c7bf14e7a07d5eb92c85e7eb8b88c" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.055592 4747 generic.go:334] "Generic (PLEG): container finished" podID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerID="907f61ef0c548664a6655890d1f078da6fa91d84c74e7cbdbbb7165d1d45bfd4" exitCode=0 Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.055628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7khwx" event={"ID":"cca84f8d-3b79-44e5-8de8-af6bc47e7bba","Type":"ContainerDied","Data":"907f61ef0c548664a6655890d1f078da6fa91d84c74e7cbdbbb7165d1d45bfd4"} Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.073733 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.091157 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gccdc"] Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.129983 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.224099 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvsv5" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.247191 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.249895 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.276154 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6d995" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.292888 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72884\" (UniqueName: \"kubernetes.io/projected/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-kube-api-access-72884\") pod \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.293009 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-catalog-content\") pod \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.293061 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-utilities\") pod \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\" (UID: \"cca84f8d-3b79-44e5-8de8-af6bc47e7bba\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 
13:24:22.294136 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-utilities" (OuterVolumeSpecName: "utilities") pod "cca84f8d-3b79-44e5-8de8-af6bc47e7bba" (UID: "cca84f8d-3b79-44e5-8de8-af6bc47e7bba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.299371 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-kube-api-access-72884" (OuterVolumeSpecName: "kube-api-access-72884") pod "cca84f8d-3b79-44e5-8de8-af6bc47e7bba" (UID: "cca84f8d-3b79-44e5-8de8-af6bc47e7bba"). InnerVolumeSpecName "kube-api-access-72884". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.329038 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bb4cp"] Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.363319 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cca84f8d-3b79-44e5-8de8-af6bc47e7bba" (UID: "cca84f8d-3b79-44e5-8de8-af6bc47e7bba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394635 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-catalog-content\") pod \"0a8a42c3-62aa-4542-b86e-171e124c81f4\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394707 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc8x6\" (UniqueName: \"kubernetes.io/projected/e05164c9-17fa-41ef-9abe-b00460c2cb96-kube-api-access-kc8x6\") pod \"e05164c9-17fa-41ef-9abe-b00460c2cb96\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394743 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-utilities\") pod \"e05164c9-17fa-41ef-9abe-b00460c2cb96\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394770 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-operator-metrics\") pod \"6f46540d-f949-4ebd-aa09-0336f09ddfef\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394842 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-trusted-ca\") pod \"6f46540d-f949-4ebd-aa09-0336f09ddfef\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394894 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-5t8gb\" (UniqueName: \"kubernetes.io/projected/0a8a42c3-62aa-4542-b86e-171e124c81f4-kube-api-access-5t8gb\") pod \"0a8a42c3-62aa-4542-b86e-171e124c81f4\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394921 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79bz9\" (UniqueName: \"kubernetes.io/projected/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-kube-api-access-79bz9\") pod \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394958 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-catalog-content\") pod \"e05164c9-17fa-41ef-9abe-b00460c2cb96\" (UID: \"e05164c9-17fa-41ef-9abe-b00460c2cb96\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.394990 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-utilities\") pod \"0a8a42c3-62aa-4542-b86e-171e124c81f4\" (UID: \"0a8a42c3-62aa-4542-b86e-171e124c81f4\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.395022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhp2f\" (UniqueName: \"kubernetes.io/projected/6f46540d-f949-4ebd-aa09-0336f09ddfef-kube-api-access-nhp2f\") pod \"6f46540d-f949-4ebd-aa09-0336f09ddfef\" (UID: \"6f46540d-f949-4ebd-aa09-0336f09ddfef\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.395053 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-catalog-content\") pod \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\" (UID: 
\"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.395090 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-utilities\") pod \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\" (UID: \"4e4d880f-9a11-4d82-b099-1fbd6cae11ec\") " Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.395569 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.395590 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.395602 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72884\" (UniqueName: \"kubernetes.io/projected/cca84f8d-3b79-44e5-8de8-af6bc47e7bba-kube-api-access-72884\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.396371 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-utilities" (OuterVolumeSpecName: "utilities") pod "4e4d880f-9a11-4d82-b099-1fbd6cae11ec" (UID: "4e4d880f-9a11-4d82-b099-1fbd6cae11ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.397387 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-utilities" (OuterVolumeSpecName: "utilities") pod "e05164c9-17fa-41ef-9abe-b00460c2cb96" (UID: "e05164c9-17fa-41ef-9abe-b00460c2cb96"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.397674 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-utilities" (OuterVolumeSpecName: "utilities") pod "0a8a42c3-62aa-4542-b86e-171e124c81f4" (UID: "0a8a42c3-62aa-4542-b86e-171e124c81f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.398109 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6f46540d-f949-4ebd-aa09-0336f09ddfef" (UID: "6f46540d-f949-4ebd-aa09-0336f09ddfef"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.399551 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05164c9-17fa-41ef-9abe-b00460c2cb96-kube-api-access-kc8x6" (OuterVolumeSpecName: "kube-api-access-kc8x6") pod "e05164c9-17fa-41ef-9abe-b00460c2cb96" (UID: "e05164c9-17fa-41ef-9abe-b00460c2cb96"). InnerVolumeSpecName "kube-api-access-kc8x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.399995 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6f46540d-f949-4ebd-aa09-0336f09ddfef" (UID: "6f46540d-f949-4ebd-aa09-0336f09ddfef"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.418421 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f46540d-f949-4ebd-aa09-0336f09ddfef-kube-api-access-nhp2f" (OuterVolumeSpecName: "kube-api-access-nhp2f") pod "6f46540d-f949-4ebd-aa09-0336f09ddfef" (UID: "6f46540d-f949-4ebd-aa09-0336f09ddfef"). InnerVolumeSpecName "kube-api-access-nhp2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.421705 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8a42c3-62aa-4542-b86e-171e124c81f4-kube-api-access-5t8gb" (OuterVolumeSpecName: "kube-api-access-5t8gb") pod "0a8a42c3-62aa-4542-b86e-171e124c81f4" (UID: "0a8a42c3-62aa-4542-b86e-171e124c81f4"). InnerVolumeSpecName "kube-api-access-5t8gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.423545 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-kube-api-access-79bz9" (OuterVolumeSpecName: "kube-api-access-79bz9") pod "4e4d880f-9a11-4d82-b099-1fbd6cae11ec" (UID: "4e4d880f-9a11-4d82-b099-1fbd6cae11ec"). InnerVolumeSpecName "kube-api-access-79bz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.439923 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a8a42c3-62aa-4542-b86e-171e124c81f4" (UID: "0a8a42c3-62aa-4542-b86e-171e124c81f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.458904 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e05164c9-17fa-41ef-9abe-b00460c2cb96" (UID: "e05164c9-17fa-41ef-9abe-b00460c2cb96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497143 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc8x6\" (UniqueName: \"kubernetes.io/projected/e05164c9-17fa-41ef-9abe-b00460c2cb96-kube-api-access-kc8x6\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497482 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497501 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497521 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f46540d-f949-4ebd-aa09-0336f09ddfef-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497534 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t8gb\" (UniqueName: \"kubernetes.io/projected/0a8a42c3-62aa-4542-b86e-171e124c81f4-kube-api-access-5t8gb\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497545 4747 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-79bz9\" (UniqueName: \"kubernetes.io/projected/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-kube-api-access-79bz9\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497558 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05164c9-17fa-41ef-9abe-b00460c2cb96-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497566 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497576 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhp2f\" (UniqueName: \"kubernetes.io/projected/6f46540d-f949-4ebd-aa09-0336f09ddfef-kube-api-access-nhp2f\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497585 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.497593 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8a42c3-62aa-4542-b86e-171e124c81f4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.534070 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e4d880f-9a11-4d82-b099-1fbd6cae11ec" (UID: "4e4d880f-9a11-4d82-b099-1fbd6cae11ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:22 crc kubenswrapper[4747]: I1128 13:24:22.598469 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e4d880f-9a11-4d82-b099-1fbd6cae11ec-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.062580 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" event={"ID":"6f46540d-f949-4ebd-aa09-0336f09ddfef","Type":"ContainerDied","Data":"002a33093848c0e8e648e36c64ab1afa2ad41dc1420b5cbd3b5ef0d65d2daa6f"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.062602 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dr67x" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.062688 4747 scope.go:117] "RemoveContainer" containerID="e7c4446189b44f89a77c16d2212e1ab5a21d69d3af35fd461a2d33fcbdebd2df" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.064741 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7khwx" event={"ID":"cca84f8d-3b79-44e5-8de8-af6bc47e7bba","Type":"ContainerDied","Data":"4d0de0425edc1cc1601032947ad347b993428692503d9653a82ced158f849ad2"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.064755 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7khwx" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.067232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvsv5" event={"ID":"4e4d880f-9a11-4d82-b099-1fbd6cae11ec","Type":"ContainerDied","Data":"7d16f3e9d6c81123f8e593204b7139d6e896b4d7238a7984d1f35c36cbac3385"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.067282 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvsv5" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.069554 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" event={"ID":"f8ce1410-e45d-4cb9-a8b3-de758929de4b","Type":"ContainerStarted","Data":"2ccfcd0d4513b2f5d1d8fbd39b9c9ae144e35122855695e6f8ecc0c93a0c9834"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.069588 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" event={"ID":"f8ce1410-e45d-4cb9-a8b3-de758929de4b","Type":"ContainerStarted","Data":"e5ddd21ea3502f0e28508190956e4ad1daf5801b4f2a0dc7828c053f094815af"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.069994 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.071804 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" event={"ID":"9b8465de-0e54-41a1-986c-c56ac227e6d0","Type":"ContainerStarted","Data":"5d9bb0ddd11e40896ce01dd7748b34b4a044dd0c2d95a3034c2c4ccfedd74508"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.071845 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" 
event={"ID":"9b8465de-0e54-41a1-986c-c56ac227e6d0","Type":"ContainerStarted","Data":"a2e4303a2c267439805742f961808eb91936c957f0119b834bce91d9f5624b3f"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.072268 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.075941 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.077335 4747 scope.go:117] "RemoveContainer" containerID="907f61ef0c548664a6655890d1f078da6fa91d84c74e7cbdbbb7165d1d45bfd4" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.078479 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d995" event={"ID":"e05164c9-17fa-41ef-9abe-b00460c2cb96","Type":"ContainerDied","Data":"2c63f327222a11bd788fe082f336bf1722d28cac1bf3787a4f43e9dcc1e8ec0b"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.082610 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6d995" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.086104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86pgk" event={"ID":"0a8a42c3-62aa-4542-b86e-171e124c81f4","Type":"ContainerDied","Data":"cf396f90350b7da06e1c9b8e7dc71438059298e0191bf8084b410e092c656d07"} Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.086229 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86pgk" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.107482 4747 scope.go:117] "RemoveContainer" containerID="5533dfd7e753a2af46fbc8912145eb0e0e40512d10146bb7ae1f5563e47142d1" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.109809 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" podStartSLOduration=2.109780211 podStartE2EDuration="2.109780211s" podCreationTimestamp="2025-11-28 13:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:24:23.10141098 +0000 UTC m=+315.763892710" watchObservedRunningTime="2025-11-28 13:24:23.109780211 +0000 UTC m=+315.772261951" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.123160 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bb4cp" podStartSLOduration=2.123079955 podStartE2EDuration="2.123079955s" podCreationTimestamp="2025-11-28 13:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:24:23.116158561 +0000 UTC m=+315.778640311" watchObservedRunningTime="2025-11-28 13:24:23.123079955 +0000 UTC m=+315.785561705" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.136411 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvsv5"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.146144 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvsv5"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.154455 4747 scope.go:117] "RemoveContainer" containerID="8c659bddf5231181c63ec8301a205fd500746c303500a0060a0375200b11f9c7" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 
13:24:23.160969 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6d995"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.170814 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6d995"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.172403 4747 scope.go:117] "RemoveContainer" containerID="8ac6e215e0585923eab640f34afcd9c8f2017e167b3a48ec06f6542870e85ed9" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.174108 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7khwx"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.177064 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7khwx"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.214842 4747 scope.go:117] "RemoveContainer" containerID="6cd42d7afc5412e9d018e3743be7dcc069f09e3f1b529525da639f804e8b019e" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.222333 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dr67x"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.238923 4747 scope.go:117] "RemoveContainer" containerID="de894167c4949aaf76f25ca2262846d4269831a558c3ec64ff9935e04a86ffbb" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.253405 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dr67x"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.255778 4747 scope.go:117] "RemoveContainer" containerID="cd9084b0d713eda79fa288e5c48b0d0ef50840c78894fad81451d4686129be55" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.258546 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-86pgk"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.261555 4747 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-86pgk"] Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.274157 4747 scope.go:117] "RemoveContainer" containerID="8d165f362e4e6b6740474da36e38658079514efae39e4f65f31b5b54af363d33" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.289590 4747 scope.go:117] "RemoveContainer" containerID="a4c56382690751d093e4d64086f0fe726471d7c917aa6adb347dd27a1c046368" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.302162 4747 scope.go:117] "RemoveContainer" containerID="5af70dbfe60b9acdeda3ce485d7c6f3319a3abe090d371d6b726ade2fdc121a3" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.321388 4747 scope.go:117] "RemoveContainer" containerID="69e8f940eda06060a16da7d49a02d66eda33de7b5b34971e2afcd6782c5c8dc0" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.336570 4747 scope.go:117] "RemoveContainer" containerID="a5748d6607e57a464a6a5ab646258e85ea62948b4fd124361109b479df4dfba1" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.650896 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" path="/var/lib/kubelet/pods/0a8a42c3-62aa-4542-b86e-171e124c81f4/volumes" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.652043 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" path="/var/lib/kubelet/pods/4e4d880f-9a11-4d82-b099-1fbd6cae11ec/volumes" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.652820 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" path="/var/lib/kubelet/pods/6f46540d-f949-4ebd-aa09-0336f09ddfef/volumes" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.653899 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" path="/var/lib/kubelet/pods/cca84f8d-3b79-44e5-8de8-af6bc47e7bba/volumes" Nov 28 13:24:23 crc 
kubenswrapper[4747]: I1128 13:24:23.654515 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" path="/var/lib/kubelet/pods/e05164c9-17fa-41ef-9abe-b00460c2cb96/volumes" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904221 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-62j9r"] Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904454 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904468 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904482 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904489 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904500 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904507 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904517 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="extract-content" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904526 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="extract-content" Nov 
28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904535 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904542 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904555 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904562 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904573 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="extract-content" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904580 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="extract-content" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904589 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904609 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904620 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904628 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" Nov 
28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904638 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904645 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="extract-utilities" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904660 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="extract-content" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904667 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="extract-content" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904676 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904684 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904694 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904701 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" Nov 28 13:24:23 crc kubenswrapper[4747]: E1128 13:24:23.904711 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="extract-content" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904719 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="extract-content" Nov 
28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904827 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05164c9-17fa-41ef-9abe-b00460c2cb96" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904839 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8a42c3-62aa-4542-b86e-171e124c81f4" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904851 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca84f8d-3b79-44e5-8de8-af6bc47e7bba" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904871 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e4d880f-9a11-4d82-b099-1fbd6cae11ec" containerName="registry-server" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904885 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.904894 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f46540d-f949-4ebd-aa09-0336f09ddfef" containerName="marketplace-operator" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.905843 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.907675 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 28 13:24:23 crc kubenswrapper[4747]: I1128 13:24:23.914383 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62j9r"] Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.015491 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d43de7-aac3-4012-b0d9-163896d07ffc-catalog-content\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.015820 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d43de7-aac3-4012-b0d9-163896d07ffc-utilities\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.015922 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f2pf\" (UniqueName: \"kubernetes.io/projected/05d43de7-aac3-4012-b0d9-163896d07ffc-kube-api-access-4f2pf\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.108929 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q49jb"] Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.110095 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.112018 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.116860 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d43de7-aac3-4012-b0d9-163896d07ffc-catalog-content\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.116914 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d43de7-aac3-4012-b0d9-163896d07ffc-utilities\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.116954 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f2pf\" (UniqueName: \"kubernetes.io/projected/05d43de7-aac3-4012-b0d9-163896d07ffc-kube-api-access-4f2pf\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.117766 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d43de7-aac3-4012-b0d9-163896d07ffc-catalog-content\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.118025 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/05d43de7-aac3-4012-b0d9-163896d07ffc-utilities\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.134731 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q49jb"] Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.141411 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f2pf\" (UniqueName: \"kubernetes.io/projected/05d43de7-aac3-4012-b0d9-163896d07ffc-kube-api-access-4f2pf\") pod \"redhat-marketplace-62j9r\" (UID: \"05d43de7-aac3-4012-b0d9-163896d07ffc\") " pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.218198 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-utilities\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.218570 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dl6k\" (UniqueName: \"kubernetes.io/projected/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-kube-api-access-6dl6k\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.218680 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-catalog-content\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " 
pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.219604 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.320284 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dl6k\" (UniqueName: \"kubernetes.io/projected/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-kube-api-access-6dl6k\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.320596 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-catalog-content\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.320633 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-utilities\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.321588 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-utilities\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.321723 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-catalog-content\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.340028 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dl6k\" (UniqueName: \"kubernetes.io/projected/d42b6c41-5b4a-4044-a66b-80fcdb3e9574-kube-api-access-6dl6k\") pod \"redhat-operators-q49jb\" (UID: \"d42b6c41-5b4a-4044-a66b-80fcdb3e9574\") " pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.429644 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.623040 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62j9r"] Nov 28 13:24:24 crc kubenswrapper[4747]: I1128 13:24:24.671515 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q49jb"] Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.094063 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c7d768ccb-fln9d"] Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.094672 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" podUID="89310d93-7fdd-4c40-be1d-3da6c18c9f51" containerName="controller-manager" containerID="cri-o://4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da" gracePeriod=30 Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.105394 4747 generic.go:334] "Generic (PLEG): container finished" podID="d42b6c41-5b4a-4044-a66b-80fcdb3e9574" containerID="f9324788057b56736989efb4e27b0869cd0377f78efb26607a480477432865cd" exitCode=0 Nov 28 13:24:25 crc 
kubenswrapper[4747]: I1128 13:24:25.105751 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q49jb" event={"ID":"d42b6c41-5b4a-4044-a66b-80fcdb3e9574","Type":"ContainerDied","Data":"f9324788057b56736989efb4e27b0869cd0377f78efb26607a480477432865cd"} Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.105908 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q49jb" event={"ID":"d42b6c41-5b4a-4044-a66b-80fcdb3e9574","Type":"ContainerStarted","Data":"8fe135d97eaafe8f5484b10be4664546964e83169c5156f32c44a94f06f1193b"} Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.108911 4747 generic.go:334] "Generic (PLEG): container finished" podID="05d43de7-aac3-4012-b0d9-163896d07ffc" containerID="e5d62e72bb93930376a950fbbe73228afea9a035519b1d055865bf6da03ed587" exitCode=0 Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.109392 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62j9r" event={"ID":"05d43de7-aac3-4012-b0d9-163896d07ffc","Type":"ContainerDied","Data":"e5d62e72bb93930376a950fbbe73228afea9a035519b1d055865bf6da03ed587"} Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.109449 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62j9r" event={"ID":"05d43de7-aac3-4012-b0d9-163896d07ffc","Type":"ContainerStarted","Data":"e071d2e3dbd1a5ff0d250935ae7e0e78eeaa4e0303781e1311d2d99334ecc828"} Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.188391 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w"] Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.188645 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" podUID="3d75d9b7-7aaf-4323-8712-e48e221128d3" 
containerName="route-controller-manager" containerID="cri-o://c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f" gracePeriod=30 Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.510241 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.569554 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652504 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n2gn\" (UniqueName: \"kubernetes.io/projected/3d75d9b7-7aaf-4323-8712-e48e221128d3-kube-api-access-9n2gn\") pod \"3d75d9b7-7aaf-4323-8712-e48e221128d3\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652569 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-client-ca\") pod \"3d75d9b7-7aaf-4323-8712-e48e221128d3\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652598 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnjd5\" (UniqueName: \"kubernetes.io/projected/89310d93-7fdd-4c40-be1d-3da6c18c9f51-kube-api-access-xnjd5\") pod \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652619 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-config\") pod \"3d75d9b7-7aaf-4323-8712-e48e221128d3\" (UID: 
\"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89310d93-7fdd-4c40-be1d-3da6c18c9f51-serving-cert\") pod \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652674 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-config\") pod \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652699 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-proxy-ca-bundles\") pod \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652738 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-client-ca\") pod \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\" (UID: \"89310d93-7fdd-4c40-be1d-3da6c18c9f51\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.652761 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d75d9b7-7aaf-4323-8712-e48e221128d3-serving-cert\") pod \"3d75d9b7-7aaf-4323-8712-e48e221128d3\" (UID: \"3d75d9b7-7aaf-4323-8712-e48e221128d3\") " Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.653998 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-proxy-ca-bundles" 
(OuterVolumeSpecName: "proxy-ca-bundles") pod "89310d93-7fdd-4c40-be1d-3da6c18c9f51" (UID: "89310d93-7fdd-4c40-be1d-3da6c18c9f51"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.654275 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-client-ca" (OuterVolumeSpecName: "client-ca") pod "89310d93-7fdd-4c40-be1d-3da6c18c9f51" (UID: "89310d93-7fdd-4c40-be1d-3da6c18c9f51"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.654611 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-config" (OuterVolumeSpecName: "config") pod "89310d93-7fdd-4c40-be1d-3da6c18c9f51" (UID: "89310d93-7fdd-4c40-be1d-3da6c18c9f51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.655072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "3d75d9b7-7aaf-4323-8712-e48e221128d3" (UID: "3d75d9b7-7aaf-4323-8712-e48e221128d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.655175 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-config" (OuterVolumeSpecName: "config") pod "3d75d9b7-7aaf-4323-8712-e48e221128d3" (UID: "3d75d9b7-7aaf-4323-8712-e48e221128d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.658027 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d75d9b7-7aaf-4323-8712-e48e221128d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3d75d9b7-7aaf-4323-8712-e48e221128d3" (UID: "3d75d9b7-7aaf-4323-8712-e48e221128d3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.658053 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89310d93-7fdd-4c40-be1d-3da6c18c9f51-kube-api-access-xnjd5" (OuterVolumeSpecName: "kube-api-access-xnjd5") pod "89310d93-7fdd-4c40-be1d-3da6c18c9f51" (UID: "89310d93-7fdd-4c40-be1d-3da6c18c9f51"). InnerVolumeSpecName "kube-api-access-xnjd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.658544 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89310d93-7fdd-4c40-be1d-3da6c18c9f51-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "89310d93-7fdd-4c40-be1d-3da6c18c9f51" (UID: "89310d93-7fdd-4c40-be1d-3da6c18c9f51"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.658724 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d75d9b7-7aaf-4323-8712-e48e221128d3-kube-api-access-9n2gn" (OuterVolumeSpecName: "kube-api-access-9n2gn") pod "3d75d9b7-7aaf-4323-8712-e48e221128d3" (UID: "3d75d9b7-7aaf-4323-8712-e48e221128d3"). InnerVolumeSpecName "kube-api-access-9n2gn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755057 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n2gn\" (UniqueName: \"kubernetes.io/projected/3d75d9b7-7aaf-4323-8712-e48e221128d3-kube-api-access-9n2gn\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755099 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-client-ca\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755114 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnjd5\" (UniqueName: \"kubernetes.io/projected/89310d93-7fdd-4c40-be1d-3da6c18c9f51-kube-api-access-xnjd5\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755127 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d75d9b7-7aaf-4323-8712-e48e221128d3-config\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755139 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89310d93-7fdd-4c40-be1d-3da6c18c9f51-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755152 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-config\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755163 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755174 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89310d93-7fdd-4c40-be1d-3da6c18c9f51-client-ca\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:25 crc kubenswrapper[4747]: I1128 13:24:25.755184 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d75d9b7-7aaf-4323-8712-e48e221128d3-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.121520 4747 generic.go:334] "Generic (PLEG): container finished" podID="3d75d9b7-7aaf-4323-8712-e48e221128d3" containerID="c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f" exitCode=0
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.121569 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.121625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" event={"ID":"3d75d9b7-7aaf-4323-8712-e48e221128d3","Type":"ContainerDied","Data":"c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f"}
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.121675 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w" event={"ID":"3d75d9b7-7aaf-4323-8712-e48e221128d3","Type":"ContainerDied","Data":"0f59647c577fbfe90707393696f4b05ed9e11cac1cd3402dc64aec8c0d5b0214"}
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.121698 4747 scope.go:117] "RemoveContainer" containerID="c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.124551 4747 generic.go:334] "Generic (PLEG): container finished" podID="89310d93-7fdd-4c40-be1d-3da6c18c9f51" containerID="4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da" exitCode=0
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.124601 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.124604 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" event={"ID":"89310d93-7fdd-4c40-be1d-3da6c18c9f51","Type":"ContainerDied","Data":"4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da"}
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.124752 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c7d768ccb-fln9d" event={"ID":"89310d93-7fdd-4c40-be1d-3da6c18c9f51","Type":"ContainerDied","Data":"7983955f34a8b94ab419fbfce9c9c12d21372de60a7d04cf5457e7f53fd4232f"}
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.137236 4747 scope.go:117] "RemoveContainer" containerID="c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f"
Nov 28 13:24:26 crc kubenswrapper[4747]: E1128 13:24:26.137681 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f\": container with ID starting with c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f not found: ID does not exist" containerID="c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.137725 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f"} err="failed to get container status \"c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f\": rpc error: code = NotFound desc = could not find container \"c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f\": container with ID starting with c27c3276c30430bfd42a67fce5ef83206f36f1b0f4a49a5a6725a22c651fdc0f not found: ID does not exist"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.137748 4747 scope.go:117] "RemoveContainer" containerID="4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.158958 4747 scope.go:117] "RemoveContainer" containerID="4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da"
Nov 28 13:24:26 crc kubenswrapper[4747]: E1128 13:24:26.160798 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da\": container with ID starting with 4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da not found: ID does not exist" containerID="4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.160858 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da"} err="failed to get container status \"4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da\": rpc error: code = NotFound desc = could not find container \"4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da\": container with ID starting with 4957a1f34229a5687ab589d4151a6393517d9cd21221597d5ff0055e904a95da not found: ID does not exist"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.162499 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.165305 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb84bb5d-2qt2w"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.173584 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c7d768ccb-fln9d"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.178581 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c7d768ccb-fln9d"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.311299 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5nz5c"]
Nov 28 13:24:26 crc kubenswrapper[4747]: E1128 13:24:26.311502 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89310d93-7fdd-4c40-be1d-3da6c18c9f51" containerName="controller-manager"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.311514 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89310d93-7fdd-4c40-be1d-3da6c18c9f51" containerName="controller-manager"
Nov 28 13:24:26 crc kubenswrapper[4747]: E1128 13:24:26.311531 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d75d9b7-7aaf-4323-8712-e48e221128d3" containerName="route-controller-manager"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.311538 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d75d9b7-7aaf-4323-8712-e48e221128d3" containerName="route-controller-manager"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.311632 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="89310d93-7fdd-4c40-be1d-3da6c18c9f51" containerName="controller-manager"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.311648 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d75d9b7-7aaf-4323-8712-e48e221128d3" containerName="route-controller-manager"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.312477 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.317164 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.319079 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nz5c"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.465309 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b47935e1-b026-44cb-8e7c-518913365e82-catalog-content\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.465486 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b47935e1-b026-44cb-8e7c-518913365e82-utilities\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.465530 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsqp8\" (UniqueName: \"kubernetes.io/projected/b47935e1-b026-44cb-8e7c-518913365e82-kube-api-access-jsqp8\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.513199 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9vf94"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.514460 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.525033 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.538044 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vf94"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.567883 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b47935e1-b026-44cb-8e7c-518913365e82-utilities\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.567957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsqp8\" (UniqueName: \"kubernetes.io/projected/b47935e1-b026-44cb-8e7c-518913365e82-kube-api-access-jsqp8\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.568067 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b47935e1-b026-44cb-8e7c-518913365e82-catalog-content\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.568600 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b47935e1-b026-44cb-8e7c-518913365e82-utilities\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.568622 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b47935e1-b026-44cb-8e7c-518913365e82-catalog-content\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.598442 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsqp8\" (UniqueName: \"kubernetes.io/projected/b47935e1-b026-44cb-8e7c-518913365e82-kube-api-access-jsqp8\") pod \"certified-operators-5nz5c\" (UID: \"b47935e1-b026-44cb-8e7c-518913365e82\") " pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.628382 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nz5c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.659243 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.659848 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.662649 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.662707 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.663092 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.663567 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.663653 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.664202 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.672410 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10510a33-c8cf-4796-ac7c-26095e641b73-catalog-content\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.672486 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr26f\" (UniqueName: \"kubernetes.io/projected/10510a33-c8cf-4796-ac7c-26095e641b73-kube-api-access-cr26f\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.672567 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10510a33-c8cf-4796-ac7c-26095e641b73-utilities\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.684004 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.685792 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.693414 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.693444 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.694018 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.694032 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.694098 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.694346 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.700665 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.707982 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.711709 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.780675 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr26f\" (UniqueName: \"kubernetes.io/projected/10510a33-c8cf-4796-ac7c-26095e641b73-kube-api-access-cr26f\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781109 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e06caef-5e8c-445d-ba47-561507f558ac-client-ca\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781148 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10510a33-c8cf-4796-ac7c-26095e641b73-utilities\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781183 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-proxy-ca-bundles\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781234 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27806617-bac0-4ce1-808b-2beffb6db404-serving-cert\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781312 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-client-ca\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781337 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-config\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781363 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e06caef-5e8c-445d-ba47-561507f558ac-config\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781423 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e06caef-5e8c-445d-ba47-561507f558ac-serving-cert\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781476 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49qz\" (UniqueName: \"kubernetes.io/projected/27806617-bac0-4ce1-808b-2beffb6db404-kube-api-access-b49qz\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lntpt\" (UniqueName: \"kubernetes.io/projected/7e06caef-5e8c-445d-ba47-561507f558ac-kube-api-access-lntpt\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.781764 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10510a33-c8cf-4796-ac7c-26095e641b73-catalog-content\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.782132 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10510a33-c8cf-4796-ac7c-26095e641b73-utilities\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.782239 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10510a33-c8cf-4796-ac7c-26095e641b73-catalog-content\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.804659 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr26f\" (UniqueName: \"kubernetes.io/projected/10510a33-c8cf-4796-ac7c-26095e641b73-kube-api-access-cr26f\") pod \"community-operators-9vf94\" (UID: \"10510a33-c8cf-4796-ac7c-26095e641b73\") " pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.841891 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vf94"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.883585 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nz5c"]
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.883863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e06caef-5e8c-445d-ba47-561507f558ac-client-ca\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.883920 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-proxy-ca-bundles\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.883945 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27806617-bac0-4ce1-808b-2beffb6db404-serving-cert\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.883980 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-client-ca\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.884007 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-config\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.884030 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e06caef-5e8c-445d-ba47-561507f558ac-config\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.884055 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e06caef-5e8c-445d-ba47-561507f558ac-serving-cert\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.884112 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49qz\" (UniqueName: \"kubernetes.io/projected/27806617-bac0-4ce1-808b-2beffb6db404-kube-api-access-b49qz\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.884140 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lntpt\" (UniqueName: \"kubernetes.io/projected/7e06caef-5e8c-445d-ba47-561507f558ac-kube-api-access-lntpt\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.885129 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e06caef-5e8c-445d-ba47-561507f558ac-client-ca\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.885387 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-client-ca\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.885550 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-proxy-ca-bundles\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.885844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27806617-bac0-4ce1-808b-2beffb6db404-config\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.891024 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e06caef-5e8c-445d-ba47-561507f558ac-serving-cert\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.897677 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e06caef-5e8c-445d-ba47-561507f558ac-config\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.901571 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lntpt\" (UniqueName: \"kubernetes.io/projected/7e06caef-5e8c-445d-ba47-561507f558ac-kube-api-access-lntpt\") pod \"route-controller-manager-9c4b6548-ktzsx\" (UID: \"7e06caef-5e8c-445d-ba47-561507f558ac\") " pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.902402 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27806617-bac0-4ce1-808b-2beffb6db404-serving-cert\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:26 crc kubenswrapper[4747]: I1128 13:24:26.903618 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49qz\" (UniqueName: \"kubernetes.io/projected/27806617-bac0-4ce1-808b-2beffb6db404-kube-api-access-b49qz\") pod \"controller-manager-5b7b96cc48-jhv2v\" (UID: \"27806617-bac0-4ce1-808b-2beffb6db404\") " pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:26.999609 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.013963 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.080760 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vf94"]
Nov 28 13:24:27 crc kubenswrapper[4747]: W1128 13:24:27.120585 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10510a33_c8cf_4796_ac7c_26095e641b73.slice/crio-bcefc41d69dac9834011889fc176485a17ce538baefb843e51b348abde9d9f3e WatchSource:0}: Error finding container bcefc41d69dac9834011889fc176485a17ce538baefb843e51b348abde9d9f3e: Status 404 returned error can't find the container with id bcefc41d69dac9834011889fc176485a17ce538baefb843e51b348abde9d9f3e
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.153174 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nz5c" event={"ID":"b47935e1-b026-44cb-8e7c-518913365e82","Type":"ContainerStarted","Data":"5b2b367814bc0550d6c771e5f18ad55b24e7c3d8cdaa8638552a9c134d882b22"}
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.153957 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nz5c" event={"ID":"b47935e1-b026-44cb-8e7c-518913365e82","Type":"ContainerStarted","Data":"110c7b9042e12281b549abe95b56eaaed1c1cbf1b9bb42a7b2c65ede49662731"}
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.163459 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vf94" event={"ID":"10510a33-c8cf-4796-ac7c-26095e641b73","Type":"ContainerStarted","Data":"bcefc41d69dac9834011889fc176485a17ce538baefb843e51b348abde9d9f3e"}
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.431594 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"]
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.463281 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v"]
Nov 28 13:24:27 crc kubenswrapper[4747]: W1128 13:24:27.488240 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27806617_bac0_4ce1_808b_2beffb6db404.slice/crio-0a77f4b811a909e05179af61a0988ee7f4a0433ac1f8f1233a987074302ce291 WatchSource:0}: Error finding container 0a77f4b811a909e05179af61a0988ee7f4a0433ac1f8f1233a987074302ce291: Status 404 returned error can't find the container with id 0a77f4b811a909e05179af61a0988ee7f4a0433ac1f8f1233a987074302ce291
Nov 28 13:24:27 crc kubenswrapper[4747]: W1128 13:24:27.488801 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e06caef_5e8c_445d_ba47_561507f558ac.slice/crio-8fc5abdb8907b7d665455ccb1832a9e08b981cca82aeb1b4fe55b79d75214f00 WatchSource:0}: Error finding container 8fc5abdb8907b7d665455ccb1832a9e08b981cca82aeb1b4fe55b79d75214f00: Status 404 returned error can't find the container with id 8fc5abdb8907b7d665455ccb1832a9e08b981cca82aeb1b4fe55b79d75214f00
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.655678 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d75d9b7-7aaf-4323-8712-e48e221128d3" path="/var/lib/kubelet/pods/3d75d9b7-7aaf-4323-8712-e48e221128d3/volumes"
Nov 28 13:24:27 crc kubenswrapper[4747]: I1128 13:24:27.656975 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89310d93-7fdd-4c40-be1d-3da6c18c9f51" path="/var/lib/kubelet/pods/89310d93-7fdd-4c40-be1d-3da6c18c9f51/volumes"
Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.172944 4747 generic.go:334] "Generic (PLEG): container finished" podID="10510a33-c8cf-4796-ac7c-26095e641b73" containerID="e8b11e23b89fb1dfe180620f9175ae187b63062ee99093feb108a76139003c0e" exitCode=0
Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.173027 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vf94" event={"ID":"10510a33-c8cf-4796-ac7c-26095e641b73","Type":"ContainerDied","Data":"e8b11e23b89fb1dfe180620f9175ae187b63062ee99093feb108a76139003c0e"}
Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.175800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx" event={"ID":"7e06caef-5e8c-445d-ba47-561507f558ac","Type":"ContainerStarted","Data":"48cd269ce9c7a6ece1a72696665dcbd29e153236f3344f363b986a5029d95938"}
Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.175891 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx" event={"ID":"7e06caef-5e8c-445d-ba47-561507f558ac","Type":"ContainerStarted","Data":"8fc5abdb8907b7d665455ccb1832a9e08b981cca82aeb1b4fe55b79d75214f00"}
Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.175914 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx"
Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.177952 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v" event={"ID":"27806617-bac0-4ce1-808b-2beffb6db404","Type":"ContainerStarted","Data":"e783387d2731c56e276980ef0e77e648b6172073d61691374eeb554723a9e348"}
Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.177995 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v" Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.178007 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v" event={"ID":"27806617-bac0-4ce1-808b-2beffb6db404","Type":"ContainerStarted","Data":"0a77f4b811a909e05179af61a0988ee7f4a0433ac1f8f1233a987074302ce291"} Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.180580 4747 generic.go:334] "Generic (PLEG): container finished" podID="d42b6c41-5b4a-4044-a66b-80fcdb3e9574" containerID="fda340fdacf34363349fabdded3edcdfed48cdf587db2689c28d87bf44267f57" exitCode=0 Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.180685 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q49jb" event={"ID":"d42b6c41-5b4a-4044-a66b-80fcdb3e9574","Type":"ContainerDied","Data":"fda340fdacf34363349fabdded3edcdfed48cdf587db2689c28d87bf44267f57"} Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.182627 4747 generic.go:334] "Generic (PLEG): container finished" podID="b47935e1-b026-44cb-8e7c-518913365e82" containerID="5b2b367814bc0550d6c771e5f18ad55b24e7c3d8cdaa8638552a9c134d882b22" exitCode=0 Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.182702 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nz5c" event={"ID":"b47935e1-b026-44cb-8e7c-518913365e82","Type":"ContainerDied","Data":"5b2b367814bc0550d6c771e5f18ad55b24e7c3d8cdaa8638552a9c134d882b22"} Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.185706 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v" Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.188125 4747 generic.go:334] "Generic (PLEG): container finished" podID="05d43de7-aac3-4012-b0d9-163896d07ffc" 
containerID="5b9e63c73ab83b709344069ec5758eec7720c5f5a70a8b3b66723f80c975a308" exitCode=0 Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.188187 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62j9r" event={"ID":"05d43de7-aac3-4012-b0d9-163896d07ffc","Type":"ContainerDied","Data":"5b9e63c73ab83b709344069ec5758eec7720c5f5a70a8b3b66723f80c975a308"} Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.240916 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx" podStartSLOduration=3.240892408 podStartE2EDuration="3.240892408s" podCreationTimestamp="2025-11-28 13:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:24:28.235899032 +0000 UTC m=+320.898380762" watchObservedRunningTime="2025-11-28 13:24:28.240892408 +0000 UTC m=+320.903374148" Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.311112 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b7b96cc48-jhv2v" podStartSLOduration=3.311095244 podStartE2EDuration="3.311095244s" podCreationTimestamp="2025-11-28 13:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:24:28.296734393 +0000 UTC m=+320.959216123" watchObservedRunningTime="2025-11-28 13:24:28.311095244 +0000 UTC m=+320.973576974" Nov 28 13:24:28 crc kubenswrapper[4747]: I1128 13:24:28.436183 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9c4b6548-ktzsx" Nov 28 13:24:29 crc kubenswrapper[4747]: I1128 13:24:29.195222 4747 generic.go:334] "Generic (PLEG): container finished" podID="b47935e1-b026-44cb-8e7c-518913365e82" 
containerID="7d509ef06604268252d6bc5ffcfb67bc1899867a0ff954f9960adc2451278049" exitCode=0 Nov 28 13:24:29 crc kubenswrapper[4747]: I1128 13:24:29.195276 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nz5c" event={"ID":"b47935e1-b026-44cb-8e7c-518913365e82","Type":"ContainerDied","Data":"7d509ef06604268252d6bc5ffcfb67bc1899867a0ff954f9960adc2451278049"} Nov 28 13:24:29 crc kubenswrapper[4747]: I1128 13:24:29.198584 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62j9r" event={"ID":"05d43de7-aac3-4012-b0d9-163896d07ffc","Type":"ContainerStarted","Data":"99b34560b7c12b49f89901969041993c7c9a47586f76a57a226b96e02f040458"} Nov 28 13:24:29 crc kubenswrapper[4747]: I1128 13:24:29.200941 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vf94" event={"ID":"10510a33-c8cf-4796-ac7c-26095e641b73","Type":"ContainerStarted","Data":"5631d0747f2ece28736c18161d48b36a323266ecfb0e1ee1ba17070a270065fc"} Nov 28 13:24:29 crc kubenswrapper[4747]: I1128 13:24:29.203177 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q49jb" event={"ID":"d42b6c41-5b4a-4044-a66b-80fcdb3e9574","Type":"ContainerStarted","Data":"03f0717265576d5e720cffc2f68b67ec79da23fccd4f73beead732bdc15ddc53"} Nov 28 13:24:29 crc kubenswrapper[4747]: I1128 13:24:29.250965 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-62j9r" podStartSLOduration=2.558713296 podStartE2EDuration="6.250946057s" podCreationTimestamp="2025-11-28 13:24:23 +0000 UTC" firstStartedPulling="2025-11-28 13:24:25.111196238 +0000 UTC m=+317.773678008" lastFinishedPulling="2025-11-28 13:24:28.803429039 +0000 UTC m=+321.465910769" observedRunningTime="2025-11-28 13:24:29.248387333 +0000 UTC m=+321.910869063" watchObservedRunningTime="2025-11-28 13:24:29.250946057 +0000 UTC 
m=+321.913427787" Nov 28 13:24:29 crc kubenswrapper[4747]: I1128 13:24:29.271545 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q49jb" podStartSLOduration=1.626180194 podStartE2EDuration="5.271528595s" podCreationTimestamp="2025-11-28 13:24:24 +0000 UTC" firstStartedPulling="2025-11-28 13:24:25.107726071 +0000 UTC m=+317.770207831" lastFinishedPulling="2025-11-28 13:24:28.753074502 +0000 UTC m=+321.415556232" observedRunningTime="2025-11-28 13:24:29.270432187 +0000 UTC m=+321.932913917" watchObservedRunningTime="2025-11-28 13:24:29.271528595 +0000 UTC m=+321.934010325" Nov 28 13:24:30 crc kubenswrapper[4747]: I1128 13:24:30.209396 4747 generic.go:334] "Generic (PLEG): container finished" podID="10510a33-c8cf-4796-ac7c-26095e641b73" containerID="5631d0747f2ece28736c18161d48b36a323266ecfb0e1ee1ba17070a270065fc" exitCode=0 Nov 28 13:24:30 crc kubenswrapper[4747]: I1128 13:24:30.209452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vf94" event={"ID":"10510a33-c8cf-4796-ac7c-26095e641b73","Type":"ContainerDied","Data":"5631d0747f2ece28736c18161d48b36a323266ecfb0e1ee1ba17070a270065fc"} Nov 28 13:24:30 crc kubenswrapper[4747]: I1128 13:24:30.216681 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nz5c" event={"ID":"b47935e1-b026-44cb-8e7c-518913365e82","Type":"ContainerStarted","Data":"b4347d559eb43fceaed6e1c36b3426177152867939b8653ecad760b9af8dc5a6"} Nov 28 13:24:30 crc kubenswrapper[4747]: I1128 13:24:30.249609 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5nz5c" podStartSLOduration=1.7501735219999999 podStartE2EDuration="4.249589698s" podCreationTimestamp="2025-11-28 13:24:26 +0000 UTC" firstStartedPulling="2025-11-28 13:24:27.168600813 +0000 UTC m=+319.831082543" lastFinishedPulling="2025-11-28 13:24:29.668016989 +0000 
UTC m=+322.330498719" observedRunningTime="2025-11-28 13:24:30.249263009 +0000 UTC m=+322.911744749" watchObservedRunningTime="2025-11-28 13:24:30.249589698 +0000 UTC m=+322.912071418" Nov 28 13:24:31 crc kubenswrapper[4747]: I1128 13:24:31.224346 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vf94" event={"ID":"10510a33-c8cf-4796-ac7c-26095e641b73","Type":"ContainerStarted","Data":"4dad1dca999112f12ecc9df2d145a27f74ec73c87bc9cc5239119c7b9e09c99a"} Nov 28 13:24:31 crc kubenswrapper[4747]: I1128 13:24:31.240660 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9vf94" podStartSLOduration=2.731148571 podStartE2EDuration="5.240641209s" podCreationTimestamp="2025-11-28 13:24:26 +0000 UTC" firstStartedPulling="2025-11-28 13:24:28.175477342 +0000 UTC m=+320.837959072" lastFinishedPulling="2025-11-28 13:24:30.68496996 +0000 UTC m=+323.347451710" observedRunningTime="2025-11-28 13:24:31.239222653 +0000 UTC m=+323.901704383" watchObservedRunningTime="2025-11-28 13:24:31.240641209 +0000 UTC m=+323.903122939" Nov 28 13:24:34 crc kubenswrapper[4747]: I1128 13:24:34.220153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:34 crc kubenswrapper[4747]: I1128 13:24:34.220595 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:34 crc kubenswrapper[4747]: I1128 13:24:34.267068 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:34 crc kubenswrapper[4747]: I1128 13:24:34.306647 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-62j9r" Nov 28 13:24:34 crc kubenswrapper[4747]: I1128 13:24:34.429852 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:34 crc kubenswrapper[4747]: I1128 13:24:34.429927 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:34 crc kubenswrapper[4747]: I1128 13:24:34.499348 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:35 crc kubenswrapper[4747]: I1128 13:24:35.304054 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q49jb" Nov 28 13:24:36 crc kubenswrapper[4747]: I1128 13:24:36.628618 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5nz5c" Nov 28 13:24:36 crc kubenswrapper[4747]: I1128 13:24:36.628981 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5nz5c" Nov 28 13:24:36 crc kubenswrapper[4747]: I1128 13:24:36.691984 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5nz5c" Nov 28 13:24:36 crc kubenswrapper[4747]: I1128 13:24:36.842673 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9vf94" Nov 28 13:24:36 crc kubenswrapper[4747]: I1128 13:24:36.842830 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9vf94" Nov 28 13:24:36 crc kubenswrapper[4747]: I1128 13:24:36.899438 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9vf94" Nov 28 13:24:37 crc kubenswrapper[4747]: I1128 13:24:37.323248 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5nz5c" Nov 28 13:24:37 crc 
kubenswrapper[4747]: I1128 13:24:37.324432 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9vf94" Nov 28 13:24:41 crc kubenswrapper[4747]: I1128 13:24:41.622501 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gccdc" Nov 28 13:24:41 crc kubenswrapper[4747]: I1128 13:24:41.698017 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jms9b"] Nov 28 13:24:47 crc kubenswrapper[4747]: I1128 13:24:47.633289 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:24:47 crc kubenswrapper[4747]: I1128 13:24:47.633748 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:25:06 crc kubenswrapper[4747]: I1128 13:25:06.743799 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" podUID="a0c345d3-2efb-458e-9b68-52c46be2279c" containerName="registry" containerID="cri-o://d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab" gracePeriod=30 Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.126028 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.261400 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-tls\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.261492 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-certificates\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.261542 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0c345d3-2efb-458e-9b68-52c46be2279c-ca-trust-extracted\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.261581 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0c345d3-2efb-458e-9b68-52c46be2279c-installation-pull-secrets\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.261721 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-bound-sa-token\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.261771 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qwnpx\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-kube-api-access-qwnpx\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.261872 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-trusted-ca\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.262099 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a0c345d3-2efb-458e-9b68-52c46be2279c\" (UID: \"a0c345d3-2efb-458e-9b68-52c46be2279c\") " Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.262792 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.263367 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.271723 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c345d3-2efb-458e-9b68-52c46be2279c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.271988 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.272298 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-kube-api-access-qwnpx" (OuterVolumeSpecName: "kube-api-access-qwnpx") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "kube-api-access-qwnpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.276710 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.277130 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.295079 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c345d3-2efb-458e-9b68-52c46be2279c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a0c345d3-2efb-458e-9b68-52c46be2279c" (UID: "a0c345d3-2efb-458e-9b68-52c46be2279c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.364650 4747 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.364711 4747 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.364734 4747 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0c345d3-2efb-458e-9b68-52c46be2279c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.364753 4747 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/a0c345d3-2efb-458e-9b68-52c46be2279c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.364774 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.364791 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwnpx\" (UniqueName: \"kubernetes.io/projected/a0c345d3-2efb-458e-9b68-52c46be2279c-kube-api-access-qwnpx\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.364810 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0c345d3-2efb-458e-9b68-52c46be2279c-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.457864 4747 generic.go:334] "Generic (PLEG): container finished" podID="a0c345d3-2efb-458e-9b68-52c46be2279c" containerID="d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab" exitCode=0 Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.458032 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.458030 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" event={"ID":"a0c345d3-2efb-458e-9b68-52c46be2279c","Type":"ContainerDied","Data":"d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab"} Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.458444 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jms9b" event={"ID":"a0c345d3-2efb-458e-9b68-52c46be2279c","Type":"ContainerDied","Data":"1b5b92b320762faf392e66825bdf2e97f6cec91b78060a72937f964de80b5dfb"} Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.458481 4747 scope.go:117] "RemoveContainer" containerID="d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.494363 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jms9b"] Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.497199 4747 scope.go:117] "RemoveContainer" containerID="d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab" Nov 28 13:25:07 crc kubenswrapper[4747]: E1128 13:25:07.497951 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab\": container with ID starting with d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab not found: ID does not exist" containerID="d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.498006 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab"} err="failed to 
get container status \"d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab\": rpc error: code = NotFound desc = could not find container \"d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab\": container with ID starting with d4aa1b43715d17dfdd9407db43bd5d3faaf6f97fff74b4de95243704107384ab not found: ID does not exist" Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.502339 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jms9b"] Nov 28 13:25:07 crc kubenswrapper[4747]: I1128 13:25:07.662327 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c345d3-2efb-458e-9b68-52c46be2279c" path="/var/lib/kubelet/pods/a0c345d3-2efb-458e-9b68-52c46be2279c/volumes" Nov 28 13:25:17 crc kubenswrapper[4747]: I1128 13:25:17.633438 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:25:17 crc kubenswrapper[4747]: I1128 13:25:17.634005 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:25:47 crc kubenswrapper[4747]: I1128 13:25:47.633510 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:25:47 crc kubenswrapper[4747]: I1128 13:25:47.634636 4747 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:25:47 crc kubenswrapper[4747]: I1128 13:25:47.634910 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:25:47 crc kubenswrapper[4747]: I1128 13:25:47.635963 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce863bbe790d4baf6afaf9f339a317c74a3f3d4a309ae619b5d042b46992a7f6"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:25:47 crc kubenswrapper[4747]: I1128 13:25:47.636053 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://ce863bbe790d4baf6afaf9f339a317c74a3f3d4a309ae619b5d042b46992a7f6" gracePeriod=600 Nov 28 13:25:48 crc kubenswrapper[4747]: I1128 13:25:48.753744 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="ce863bbe790d4baf6afaf9f339a317c74a3f3d4a309ae619b5d042b46992a7f6" exitCode=0 Nov 28 13:25:48 crc kubenswrapper[4747]: I1128 13:25:48.753884 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"ce863bbe790d4baf6afaf9f339a317c74a3f3d4a309ae619b5d042b46992a7f6"} Nov 28 13:25:48 crc kubenswrapper[4747]: I1128 13:25:48.754243 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"ac7815a74fe47c8f9aa22920131c6733c5e8d6e71cf9eb9ebc2aac1920209e1f"} Nov 28 13:25:48 crc kubenswrapper[4747]: I1128 13:25:48.754289 4747 scope.go:117] "RemoveContainer" containerID="31295fd467714309c3c51e63384ec7b5216b0539d01c6383d6f0fd40fcc4aefd" Nov 28 13:26:07 crc kubenswrapper[4747]: I1128 13:26:07.870809 4747 scope.go:117] "RemoveContainer" containerID="62a51bba7c8a667a83cdfb311267811729d1ecb711b757f386d989239d03c051" Nov 28 13:26:07 crc kubenswrapper[4747]: I1128 13:26:07.888782 4747 scope.go:117] "RemoveContainer" containerID="9780dfa1615a45ee8341518471a66e629e8b66bb4b214b80c6da6c3b361b84cf" Nov 28 13:26:07 crc kubenswrapper[4747]: I1128 13:26:07.921840 4747 scope.go:117] "RemoveContainer" containerID="fc68970f86b26fb277caf284260857d82e39ad4d0f0fe83de57dd7ddde36de49" Nov 28 13:26:07 crc kubenswrapper[4747]: I1128 13:26:07.986067 4747 scope.go:117] "RemoveContainer" containerID="6e2a5a9c54da8d2fbc7a485c2ddd88876682b10be091b190b4a7f00525593833" Nov 28 13:26:08 crc kubenswrapper[4747]: I1128 13:26:08.018541 4747 scope.go:117] "RemoveContainer" containerID="50b7a5462beb47941ec193788e0ad41996b51688bb7be91bbf81c22c5e54848a" Nov 28 13:26:08 crc kubenswrapper[4747]: I1128 13:26:08.036687 4747 scope.go:117] "RemoveContainer" containerID="0a7896685895aa4e3f6d4fa473ea464146fa29c04419e6e5bd4de9db290cf193" Nov 28 13:27:47 crc kubenswrapper[4747]: I1128 13:27:47.632637 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:27:47 crc kubenswrapper[4747]: I1128 13:27:47.633514 4747 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:28:17 crc kubenswrapper[4747]: I1128 13:28:17.633581 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:28:17 crc kubenswrapper[4747]: I1128 13:28:17.634375 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:28:47 crc kubenswrapper[4747]: I1128 13:28:47.632844 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:28:47 crc kubenswrapper[4747]: I1128 13:28:47.633974 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:28:47 crc kubenswrapper[4747]: I1128 13:28:47.634060 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:28:47 crc 
kubenswrapper[4747]: I1128 13:28:47.634985 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac7815a74fe47c8f9aa22920131c6733c5e8d6e71cf9eb9ebc2aac1920209e1f"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:28:47 crc kubenswrapper[4747]: I1128 13:28:47.635259 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://ac7815a74fe47c8f9aa22920131c6733c5e8d6e71cf9eb9ebc2aac1920209e1f" gracePeriod=600 Nov 28 13:28:47 crc kubenswrapper[4747]: I1128 13:28:47.966806 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="ac7815a74fe47c8f9aa22920131c6733c5e8d6e71cf9eb9ebc2aac1920209e1f" exitCode=0 Nov 28 13:28:47 crc kubenswrapper[4747]: I1128 13:28:47.966901 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"ac7815a74fe47c8f9aa22920131c6733c5e8d6e71cf9eb9ebc2aac1920209e1f"} Nov 28 13:28:47 crc kubenswrapper[4747]: I1128 13:28:47.967009 4747 scope.go:117] "RemoveContainer" containerID="ce863bbe790d4baf6afaf9f339a317c74a3f3d4a309ae619b5d042b46992a7f6" Nov 28 13:28:48 crc kubenswrapper[4747]: I1128 13:28:48.978589 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"5fdc296405c58b503731bb8ebbd3318202226659d1222af8e629d5358c8f2a8d"} Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.896883 4747 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hb2vp"] Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.898499 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-controller" containerID="cri-o://0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" gracePeriod=30 Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.898640 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" gracePeriod=30 Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.898638 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="nbdb" containerID="cri-o://94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" gracePeriod=30 Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.898715 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kube-rbac-proxy-node" containerID="cri-o://76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" gracePeriod=30 Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.898831 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="northd" containerID="cri-o://262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" gracePeriod=30 Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.898721 4747 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-acl-logging" containerID="cri-o://4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" gracePeriod=30 Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.898970 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="sbdb" containerID="cri-o://7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" gracePeriod=30 Nov 28 13:29:51 crc kubenswrapper[4747]: I1128 13:29:51.983519 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" containerID="cri-o://6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" gracePeriod=30 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.230765 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/2.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.236440 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovn-acl-logging/0.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.237104 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovn-controller/0.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.237566 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282701 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-config\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282776 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-netns\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282806 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-slash\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282849 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-env-overrides\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282877 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-ovn\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282910 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282942 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-node-log\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282963 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-script-lib\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.282997 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-systemd-units\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283015 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-bin\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283035 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rlqb\" (UniqueName: \"kubernetes.io/projected/a52417df-b828-4251-a786-afae5d1aa9fd-kube-api-access-9rlqb\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: 
\"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283077 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-openvswitch\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283099 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a52417df-b828-4251-a786-afae5d1aa9fd-ovn-node-metrics-cert\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283112 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-kubelet\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283159 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-log-socket\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283180 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-etc-openvswitch\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283184 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283196 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-systemd\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283243 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-ovn-kubernetes\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283252 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283266 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-var-lib-openvswitch\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283274 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283295 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-slash" (OuterVolumeSpecName: "host-slash") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283284 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-netd\") pod \"a52417df-b828-4251-a786-afae5d1aa9fd\" (UID: \"a52417df-b828-4251-a786-afae5d1aa9fd\") " Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283554 4747 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283570 4747 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-slash\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283582 4747 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283592 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283638 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283656 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283667 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283703 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283712 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.283732 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-node-log" (OuterVolumeSpecName: "node-log") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284155 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284231 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-log-socket" (OuterVolumeSpecName: "log-socket") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284255 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284387 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284757 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284800 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ptllq"] Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284825 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.284849 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.284987 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="sbdb" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285005 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="sbdb" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285013 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kubecfg-setup" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285021 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kubecfg-setup" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285029 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="northd" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285035 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="northd" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285045 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="nbdb" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285050 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="nbdb" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285058 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c345d3-2efb-458e-9b68-52c46be2279c" containerName="registry" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285064 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c345d3-2efb-458e-9b68-52c46be2279c" containerName="registry" Nov 28 13:29:52 crc 
kubenswrapper[4747]: E1128 13:29:52.285076 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285082 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285089 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285095 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285104 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-acl-logging" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285111 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-acl-logging" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285119 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kube-rbac-proxy-node" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285125 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kube-rbac-proxy-node" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285138 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kube-rbac-proxy-ovn-metrics" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285143 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" 
containerName="kube-rbac-proxy-ovn-metrics" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285151 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285157 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285162 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285167 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285274 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="sbdb" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285285 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285292 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="nbdb" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285298 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285304 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c345d3-2efb-458e-9b68-52c46be2279c" containerName="registry" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285312 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kube-rbac-proxy-node" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285320 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="kube-rbac-proxy-ovn-metrics" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285326 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="northd" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285333 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285339 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovn-acl-logging" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285346 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.285452 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285459 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.285540 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" containerName="ovnkube-controller" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.286971 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.291590 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52417df-b828-4251-a786-afae5d1aa9fd-kube-api-access-9rlqb" (OuterVolumeSpecName: "kube-api-access-9rlqb") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "kube-api-access-9rlqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.293054 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52417df-b828-4251-a786-afae5d1aa9fd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.298589 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a52417df-b828-4251-a786-afae5d1aa9fd" (UID: "a52417df-b828-4251-a786-afae5d1aa9fd"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385227 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-ovnkube-config\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385289 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-cni-netd\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385328 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rhnv\" (UniqueName: \"kubernetes.io/projected/286e332a-f287-4d77-9843-85fc3700f03e-kube-api-access-7rhnv\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385377 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-ovn\") pod 
\"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385409 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-env-overrides\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385438 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-ovnkube-script-lib\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385471 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-var-lib-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385503 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-node-log\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385535 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-cni-bin\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385566 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385600 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-log-socket\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385628 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-slash\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385658 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-etc-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385687 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/286e332a-f287-4d77-9843-85fc3700f03e-ovn-node-metrics-cert\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385720 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-systemd-units\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385753 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385789 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-systemd\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385842 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-run-netns\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385875 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-kubelet\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385949 4747 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.385979 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a52417df-b828-4251-a786-afae5d1aa9fd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386004 4747 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386022 4747 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-log-socket\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386040 4747 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386057 4747 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386074 4747 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386092 4747 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386109 4747 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386126 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386143 4747 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386160 4747 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386177 4747 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-node-log\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386195 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/a52417df-b828-4251-a786-afae5d1aa9fd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386240 4747 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a52417df-b828-4251-a786-afae5d1aa9fd-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.386258 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rlqb\" (UniqueName: \"kubernetes.io/projected/a52417df-b828-4251-a786-afae5d1aa9fd-kube-api-access-9rlqb\") on node \"crc\" DevicePath \"\"" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.406865 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-78psz_11d91e3e-309b-4e83-9b0c-1f589c7670f6/kube-multus/1.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.407727 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-78psz_11d91e3e-309b-4e83-9b0c-1f589c7670f6/kube-multus/0.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.407790 4747 generic.go:334] "Generic (PLEG): container finished" podID="11d91e3e-309b-4e83-9b0c-1f589c7670f6" containerID="9da05d5b4fc961afd237c2c6db2f8bc212546df0bbf8cf3b241736b6459df234" exitCode=2 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.407871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78psz" event={"ID":"11d91e3e-309b-4e83-9b0c-1f589c7670f6","Type":"ContainerDied","Data":"9da05d5b4fc961afd237c2c6db2f8bc212546df0bbf8cf3b241736b6459df234"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.407925 4747 scope.go:117] "RemoveContainer" containerID="65dd25be31296be6db2596076451f5a63b417321468737598b42908a8cd716df" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.408957 4747 scope.go:117] "RemoveContainer" 
containerID="9da05d5b4fc961afd237c2c6db2f8bc212546df0bbf8cf3b241736b6459df234" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.409356 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-78psz_openshift-multus(11d91e3e-309b-4e83-9b0c-1f589c7670f6)\"" pod="openshift-multus/multus-78psz" podUID="11d91e3e-309b-4e83-9b0c-1f589c7670f6" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.413881 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovnkube-controller/2.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.416490 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovn-acl-logging/0.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417030 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb2vp_a52417df-b828-4251-a786-afae5d1aa9fd/ovn-controller/0.log" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417498 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" exitCode=0 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417518 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" exitCode=0 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417525 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" exitCode=0 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 
13:29:52.417532 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" exitCode=0 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417541 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" exitCode=0 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417548 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" exitCode=0 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417555 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" exitCode=143 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417562 4747 generic.go:334] "Generic (PLEG): container finished" podID="a52417df-b828-4251-a786-afae5d1aa9fd" containerID="0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" exitCode=143 Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417582 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417608 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417620 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417626 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417630 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.417750 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418229 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418257 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418269 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418274 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418280 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418285 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418289 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418295 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418301 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418306 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418311 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418319 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418326 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418332 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418339 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418345 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418351 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418356 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418361 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 
13:29:52.418366 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418371 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418375 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418382 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418390 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418396 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418401 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418407 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418412 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418417 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418422 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418427 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418431 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418436 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb2vp" event={"ID":"a52417df-b828-4251-a786-afae5d1aa9fd","Type":"ContainerDied","Data":"30df8e79cce6aecfda6809ed6a9210cf4df4b6bda193a518a8aeee17eadc3bec"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418450 4747 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418456 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418463 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418467 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418472 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418477 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418482 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418487 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418492 4747 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.418497 4747 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.449906 4747 scope.go:117] "RemoveContainer" containerID="6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.467872 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hb2vp"] Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.472258 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hb2vp"] Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.475977 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.486914 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-run-netns\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.486967 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-kubelet\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.486997 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487022 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-ovnkube-config\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487050 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-cni-netd\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487059 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-run-netns\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487156 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487201 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-kubelet\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487072 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rhnv\" (UniqueName: \"kubernetes.io/projected/286e332a-f287-4d77-9843-85fc3700f03e-kube-api-access-7rhnv\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487363 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-ovn\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487169 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-cni-netd\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487405 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-env-overrides\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-ovnkube-script-lib\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487506 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-var-lib-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487533 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-node-log\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487559 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-cni-bin\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487581 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487609 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-node-log\") pod 
\"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487637 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-log-socket\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487647 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-cni-bin\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487634 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-var-lib-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487671 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-slash\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487506 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-ovn\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 
28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487706 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487849 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-log-socket\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487852 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-etc-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487893 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-etc-openvswitch\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487903 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-slash\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.487939 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/286e332a-f287-4d77-9843-85fc3700f03e-ovn-node-metrics-cert\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488012 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-systemd-units\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488046 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488072 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-systemd\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488371 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488413 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-run-systemd\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488423 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-ovnkube-config\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488431 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/286e332a-f287-4d77-9843-85fc3700f03e-systemd-units\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.488886 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-env-overrides\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.489447 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/286e332a-f287-4d77-9843-85fc3700f03e-ovnkube-script-lib\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.492168 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/286e332a-f287-4d77-9843-85fc3700f03e-ovn-node-metrics-cert\") pod \"ovnkube-node-ptllq\" 
(UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.494746 4747 scope.go:117] "RemoveContainer" containerID="7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.507913 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rhnv\" (UniqueName: \"kubernetes.io/projected/286e332a-f287-4d77-9843-85fc3700f03e-kube-api-access-7rhnv\") pod \"ovnkube-node-ptllq\" (UID: \"286e332a-f287-4d77-9843-85fc3700f03e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.509122 4747 scope.go:117] "RemoveContainer" containerID="94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.520990 4747 scope.go:117] "RemoveContainer" containerID="262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.534120 4747 scope.go:117] "RemoveContainer" containerID="e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.550166 4747 scope.go:117] "RemoveContainer" containerID="76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.561529 4747 scope.go:117] "RemoveContainer" containerID="4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.573837 4747 scope.go:117] "RemoveContainer" containerID="0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.587020 4747 scope.go:117] "RemoveContainer" containerID="58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.598145 4747 
scope.go:117] "RemoveContainer" containerID="6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.598673 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": container with ID starting with 6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d not found: ID does not exist" containerID="6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.598716 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} err="failed to get container status \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": rpc error: code = NotFound desc = could not find container \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": container with ID starting with 6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.598751 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.599076 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": container with ID starting with f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f not found: ID does not exist" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.599109 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} err="failed to get container status \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": rpc error: code = NotFound desc = could not find container \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": container with ID starting with f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.599132 4747 scope.go:117] "RemoveContainer" containerID="7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.599440 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": container with ID starting with 7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9 not found: ID does not exist" containerID="7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.599474 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} err="failed to get container status \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": rpc error: code = NotFound desc = could not find container \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": container with ID starting with 7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.599493 4747 scope.go:117] "RemoveContainer" containerID="94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.599734 4747 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": container with ID starting with 94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62 not found: ID does not exist" containerID="94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.599756 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} err="failed to get container status \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": rpc error: code = NotFound desc = could not find container \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": container with ID starting with 94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.599770 4747 scope.go:117] "RemoveContainer" containerID="262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.600006 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": container with ID starting with 262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a not found: ID does not exist" containerID="262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.600051 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} err="failed to get container status \"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": rpc error: code = NotFound desc = could not find container 
\"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": container with ID starting with 262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.600083 4747 scope.go:117] "RemoveContainer" containerID="e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.600705 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": container with ID starting with e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f not found: ID does not exist" containerID="e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.600734 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} err="failed to get container status \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": rpc error: code = NotFound desc = could not find container \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": container with ID starting with e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.600750 4747 scope.go:117] "RemoveContainer" containerID="76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.601062 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": container with ID starting with 76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6 not found: ID does not exist" 
containerID="76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.601112 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} err="failed to get container status \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": rpc error: code = NotFound desc = could not find container \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": container with ID starting with 76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.601145 4747 scope.go:117] "RemoveContainer" containerID="4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.601569 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": container with ID starting with 4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b not found: ID does not exist" containerID="4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.601594 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} err="failed to get container status \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": rpc error: code = NotFound desc = could not find container \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": container with ID starting with 4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.601609 4747 scope.go:117] 
"RemoveContainer" containerID="0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.602060 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": container with ID starting with 0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2 not found: ID does not exist" containerID="0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.602083 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} err="failed to get container status \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": rpc error: code = NotFound desc = could not find container \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": container with ID starting with 0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.602103 4747 scope.go:117] "RemoveContainer" containerID="58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540" Nov 28 13:29:52 crc kubenswrapper[4747]: E1128 13:29:52.602466 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": container with ID starting with 58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540 not found: ID does not exist" containerID="58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.602499 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} err="failed to get container status \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": rpc error: code = NotFound desc = could not find container \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": container with ID starting with 58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.602523 4747 scope.go:117] "RemoveContainer" containerID="6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.602887 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} err="failed to get container status \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": rpc error: code = NotFound desc = could not find container \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": container with ID starting with 6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.602922 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.603303 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} err="failed to get container status \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": rpc error: code = NotFound desc = could not find container \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": container with ID starting with f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f not found: ID does not 
exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.603324 4747 scope.go:117] "RemoveContainer" containerID="7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.603765 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} err="failed to get container status \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": rpc error: code = NotFound desc = could not find container \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": container with ID starting with 7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.603825 4747 scope.go:117] "RemoveContainer" containerID="94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.604137 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} err="failed to get container status \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": rpc error: code = NotFound desc = could not find container \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": container with ID starting with 94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.604163 4747 scope.go:117] "RemoveContainer" containerID="262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.604522 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} err="failed to get container status 
\"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": rpc error: code = NotFound desc = could not find container \"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": container with ID starting with 262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.604556 4747 scope.go:117] "RemoveContainer" containerID="e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.604974 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} err="failed to get container status \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": rpc error: code = NotFound desc = could not find container \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": container with ID starting with e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.604998 4747 scope.go:117] "RemoveContainer" containerID="76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.605404 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} err="failed to get container status \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": rpc error: code = NotFound desc = could not find container \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": container with ID starting with 76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.605458 4747 scope.go:117] "RemoveContainer" 
containerID="4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.605914 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} err="failed to get container status \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": rpc error: code = NotFound desc = could not find container \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": container with ID starting with 4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.605940 4747 scope.go:117] "RemoveContainer" containerID="0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.606356 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} err="failed to get container status \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": rpc error: code = NotFound desc = could not find container \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": container with ID starting with 0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.606398 4747 scope.go:117] "RemoveContainer" containerID="58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.606685 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} err="failed to get container status \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": rpc error: code = NotFound desc = could 
not find container \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": container with ID starting with 58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.606707 4747 scope.go:117] "RemoveContainer" containerID="6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.607002 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} err="failed to get container status \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": rpc error: code = NotFound desc = could not find container \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": container with ID starting with 6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.607022 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.607319 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} err="failed to get container status \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": rpc error: code = NotFound desc = could not find container \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": container with ID starting with f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.607340 4747 scope.go:117] "RemoveContainer" containerID="7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 
13:29:52.607610 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} err="failed to get container status \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": rpc error: code = NotFound desc = could not find container \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": container with ID starting with 7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.607639 4747 scope.go:117] "RemoveContainer" containerID="94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.607912 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} err="failed to get container status \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": rpc error: code = NotFound desc = could not find container \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": container with ID starting with 94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.607934 4747 scope.go:117] "RemoveContainer" containerID="262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.608183 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} err="failed to get container status \"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": rpc error: code = NotFound desc = could not find container \"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": container with ID starting with 
262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.608217 4747 scope.go:117] "RemoveContainer" containerID="e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.608467 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} err="failed to get container status \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": rpc error: code = NotFound desc = could not find container \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": container with ID starting with e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.608509 4747 scope.go:117] "RemoveContainer" containerID="76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.608772 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} err="failed to get container status \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": rpc error: code = NotFound desc = could not find container \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": container with ID starting with 76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.608796 4747 scope.go:117] "RemoveContainer" containerID="4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.609026 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} err="failed to get container status \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": rpc error: code = NotFound desc = could not find container \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": container with ID starting with 4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.609065 4747 scope.go:117] "RemoveContainer" containerID="0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.609380 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} err="failed to get container status \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": rpc error: code = NotFound desc = could not find container \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": container with ID starting with 0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.609403 4747 scope.go:117] "RemoveContainer" containerID="58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.609666 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} err="failed to get container status \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": rpc error: code = NotFound desc = could not find container \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": container with ID starting with 58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540 not found: ID does not 
exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.609699 4747 scope.go:117] "RemoveContainer" containerID="6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.609978 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d"} err="failed to get container status \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": rpc error: code = NotFound desc = could not find container \"6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d\": container with ID starting with 6cc605c3380a0ed91170230d7c6e5fe761fa1b69ee3bf236391dc6a83460aa7d not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.610000 4747 scope.go:117] "RemoveContainer" containerID="f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.610307 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f"} err="failed to get container status \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": rpc error: code = NotFound desc = could not find container \"f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f\": container with ID starting with f3b93898b9596e789e1e19e40ebc766bd2328acf9ac3b2569c869c42d333fa6f not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.610335 4747 scope.go:117] "RemoveContainer" containerID="7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.610650 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9"} err="failed to get container status 
\"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": rpc error: code = NotFound desc = could not find container \"7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9\": container with ID starting with 7db7597317d6ae98752a4d75fcd9cdbb65bfa95de2fe3a6def8382bc13a5b4d9 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.610671 4747 scope.go:117] "RemoveContainer" containerID="94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.610923 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62"} err="failed to get container status \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": rpc error: code = NotFound desc = could not find container \"94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62\": container with ID starting with 94e55d0b27e018647316964e8577eaa45a204cb16d7e41daff8404e21541bb62 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.610951 4747 scope.go:117] "RemoveContainer" containerID="262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.611164 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a"} err="failed to get container status \"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": rpc error: code = NotFound desc = could not find container \"262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a\": container with ID starting with 262bf154a1a7003e3b281f2e696771feb3ece6aee73bbf3244413d3930cff10a not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.611188 4747 scope.go:117] "RemoveContainer" 
containerID="e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.611448 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f"} err="failed to get container status \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": rpc error: code = NotFound desc = could not find container \"e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f\": container with ID starting with e042ac91e26504c977160a0e1f4b42a7c83be8a517a76339fc3dc78faecd173f not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.611473 4747 scope.go:117] "RemoveContainer" containerID="76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.611711 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6"} err="failed to get container status \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": rpc error: code = NotFound desc = could not find container \"76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6\": container with ID starting with 76daf51a5e135712c08935cb71f11a07386a82068fc49fbd059d143bc3f3d6d6 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.611734 4747 scope.go:117] "RemoveContainer" containerID="4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.612025 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b"} err="failed to get container status \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": rpc error: code = NotFound desc = could 
not find container \"4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b\": container with ID starting with 4e92f679c621138393415451e26aafd2eab3188dab32b00c246cbeb75295560b not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.612053 4747 scope.go:117] "RemoveContainer" containerID="0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.612322 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2"} err="failed to get container status \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": rpc error: code = NotFound desc = could not find container \"0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2\": container with ID starting with 0954d2e7c3d148f1fd71e32b5fbbc11519a59784b2e94509e610c2108179a6e2 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.612362 4747 scope.go:117] "RemoveContainer" containerID="58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.612824 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540"} err="failed to get container status \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": rpc error: code = NotFound desc = could not find container \"58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540\": container with ID starting with 58320569b6adc845c6b3e7bcb7c863f92e4437f6f5d01a7d7c016e094aca2540 not found: ID does not exist" Nov 28 13:29:52 crc kubenswrapper[4747]: I1128 13:29:52.630458 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:53 crc kubenswrapper[4747]: I1128 13:29:53.426248 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-78psz_11d91e3e-309b-4e83-9b0c-1f589c7670f6/kube-multus/1.log" Nov 28 13:29:53 crc kubenswrapper[4747]: I1128 13:29:53.429784 4747 generic.go:334] "Generic (PLEG): container finished" podID="286e332a-f287-4d77-9843-85fc3700f03e" containerID="cf00f4453aca0b8eb85ba76dac9d215a04a813ab648ee24a8d94b748ceb656ea" exitCode=0 Nov 28 13:29:53 crc kubenswrapper[4747]: I1128 13:29:53.429825 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerDied","Data":"cf00f4453aca0b8eb85ba76dac9d215a04a813ab648ee24a8d94b748ceb656ea"} Nov 28 13:29:53 crc kubenswrapper[4747]: I1128 13:29:53.429921 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"e67a460ab37a250d879eec13b5accd64b9bfdc7ccc2e585d1077b5b5706098fc"} Nov 28 13:29:53 crc kubenswrapper[4747]: I1128 13:29:53.651992 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52417df-b828-4251-a786-afae5d1aa9fd" path="/var/lib/kubelet/pods/a52417df-b828-4251-a786-afae5d1aa9fd/volumes" Nov 28 13:29:54 crc kubenswrapper[4747]: I1128 13:29:54.455827 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"df60ed5267b3aa344e3ca3dcaf658efbab293d845600cf08c09fc2eb7c22e25a"} Nov 28 13:29:54 crc kubenswrapper[4747]: I1128 13:29:54.456790 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" 
event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"98c2bd95f8acefab7ce3f6e1a3c1a7116c0cec37bb501009a685fb155fa6bda9"} Nov 28 13:29:54 crc kubenswrapper[4747]: I1128 13:29:54.456813 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"f04711b4b8a0fbfc4e399e1d3b98b55a3777f69f2b22124765704ed133a2d97d"} Nov 28 13:29:54 crc kubenswrapper[4747]: I1128 13:29:54.456831 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"03d5d1dab3e216bc058ffa4eb2c57d6693dfa8bb5160f19035a03fff4421611c"} Nov 28 13:29:54 crc kubenswrapper[4747]: I1128 13:29:54.456846 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"553620a01ec02deb181ce8b20381beecc6650ca18e432f964bc8277f61659bc4"} Nov 28 13:29:54 crc kubenswrapper[4747]: I1128 13:29:54.456862 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"89a44d6a2018df942a20139e68e2484a2c11cd76d8e460cb7952dc0047c6ba17"} Nov 28 13:29:57 crc kubenswrapper[4747]: I1128 13:29:57.485357 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"744798c47a731b976555ad140faf26792a2614dbab9d58336625c9836f1d5499"} Nov 28 13:29:59 crc kubenswrapper[4747]: I1128 13:29:59.501941 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" 
event={"ID":"286e332a-f287-4d77-9843-85fc3700f03e","Type":"ContainerStarted","Data":"bca3fae8cf5a85232c25b84c5dc0b251cd77aa759fafa146264b2a8fe61aa8c6"} Nov 28 13:29:59 crc kubenswrapper[4747]: I1128 13:29:59.502566 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:59 crc kubenswrapper[4747]: I1128 13:29:59.502589 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:59 crc kubenswrapper[4747]: I1128 13:29:59.535847 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:59 crc kubenswrapper[4747]: I1128 13:29:59.538563 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:29:59 crc kubenswrapper[4747]: I1128 13:29:59.553847 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" podStartSLOduration=7.553825024 podStartE2EDuration="7.553825024s" podCreationTimestamp="2025-11-28 13:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:29:59.544567013 +0000 UTC m=+652.207048743" watchObservedRunningTime="2025-11-28 13:29:59.553825024 +0000 UTC m=+652.216306774" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.179914 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45"] Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.180565 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.182689 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.182810 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.191716 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45"] Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.201674 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ab70b-c999-4bc9-99f0-093443386e7b-secret-volume\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.201745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ab70b-c999-4bc9-99f0-093443386e7b-config-volume\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.201830 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqr7q\" (UniqueName: \"kubernetes.io/projected/972ab70b-c999-4bc9-99f0-093443386e7b-kube-api-access-pqr7q\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.303468 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqr7q\" (UniqueName: \"kubernetes.io/projected/972ab70b-c999-4bc9-99f0-093443386e7b-kube-api-access-pqr7q\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.303633 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ab70b-c999-4bc9-99f0-093443386e7b-secret-volume\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.303682 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ab70b-c999-4bc9-99f0-093443386e7b-config-volume\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.305114 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ab70b-c999-4bc9-99f0-093443386e7b-config-volume\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.310014 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/972ab70b-c999-4bc9-99f0-093443386e7b-secret-volume\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.327044 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqr7q\" (UniqueName: \"kubernetes.io/projected/972ab70b-c999-4bc9-99f0-093443386e7b-kube-api-access-pqr7q\") pod \"collect-profiles-29405610-nvs45\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.502069 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: I1128 13:30:00.511696 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:30:00 crc kubenswrapper[4747]: E1128 13:30:00.538060 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(ca97cdf0f70b792d66fdc8437d9fcb5ad29e9fc9d57c999d604eaa53fd52ca62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 28 13:30:00 crc kubenswrapper[4747]: E1128 13:30:00.538192 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(ca97cdf0f70b792d66fdc8437d9fcb5ad29e9fc9d57c999d604eaa53fd52ca62): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: E1128 13:30:00.538286 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(ca97cdf0f70b792d66fdc8437d9fcb5ad29e9fc9d57c999d604eaa53fd52ca62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:00 crc kubenswrapper[4747]: E1128 13:30:00.538415 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager(972ab70b-c999-4bc9-99f0-093443386e7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager(972ab70b-c999-4bc9-99f0-093443386e7b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(ca97cdf0f70b792d66fdc8437d9fcb5ad29e9fc9d57c999d604eaa53fd52ca62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" podUID="972ab70b-c999-4bc9-99f0-093443386e7b" Nov 28 13:30:01 crc kubenswrapper[4747]: I1128 13:30:01.516366 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:01 crc kubenswrapper[4747]: I1128 13:30:01.518658 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:01 crc kubenswrapper[4747]: E1128 13:30:01.550223 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(9af8775db3552b7a2c3b082825e156d648b8c594e90549e3f9fbc1884a371ff0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 28 13:30:01 crc kubenswrapper[4747]: E1128 13:30:01.550357 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(9af8775db3552b7a2c3b082825e156d648b8c594e90549e3f9fbc1884a371ff0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:01 crc kubenswrapper[4747]: E1128 13:30:01.550389 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(9af8775db3552b7a2c3b082825e156d648b8c594e90549e3f9fbc1884a371ff0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:01 crc kubenswrapper[4747]: E1128 13:30:01.550463 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager(972ab70b-c999-4bc9-99f0-093443386e7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager(972ab70b-c999-4bc9-99f0-093443386e7b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29405610-nvs45_openshift-operator-lifecycle-manager_972ab70b-c999-4bc9-99f0-093443386e7b_0(9af8775db3552b7a2c3b082825e156d648b8c594e90549e3f9fbc1884a371ff0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" podUID="972ab70b-c999-4bc9-99f0-093443386e7b" Nov 28 13:30:07 crc kubenswrapper[4747]: I1128 13:30:07.649876 4747 scope.go:117] "RemoveContainer" containerID="9da05d5b4fc961afd237c2c6db2f8bc212546df0bbf8cf3b241736b6459df234" Nov 28 13:30:08 crc kubenswrapper[4747]: I1128 13:30:08.565658 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-78psz_11d91e3e-309b-4e83-9b0c-1f589c7670f6/kube-multus/1.log" Nov 28 13:30:08 crc kubenswrapper[4747]: I1128 13:30:08.566505 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78psz" event={"ID":"11d91e3e-309b-4e83-9b0c-1f589c7670f6","Type":"ContainerStarted","Data":"18e7df3e03e0a313788dbaadfcefa9becef98a8a1c10f01fda6c04544b88ea90"} Nov 28 13:30:15 crc kubenswrapper[4747]: I1128 13:30:15.641174 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:15 crc kubenswrapper[4747]: I1128 13:30:15.642563 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:15 crc kubenswrapper[4747]: I1128 13:30:15.961181 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45"] Nov 28 13:30:16 crc kubenswrapper[4747]: I1128 13:30:16.629365 4747 generic.go:334] "Generic (PLEG): container finished" podID="972ab70b-c999-4bc9-99f0-093443386e7b" containerID="4caab6f86ea0a5a190cd4487ff14f1840868799066efe3edf1271d79ba6b91bb" exitCode=0 Nov 28 13:30:16 crc kubenswrapper[4747]: I1128 13:30:16.629518 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" event={"ID":"972ab70b-c999-4bc9-99f0-093443386e7b","Type":"ContainerDied","Data":"4caab6f86ea0a5a190cd4487ff14f1840868799066efe3edf1271d79ba6b91bb"} Nov 28 13:30:16 crc kubenswrapper[4747]: I1128 13:30:16.629758 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" event={"ID":"972ab70b-c999-4bc9-99f0-093443386e7b","Type":"ContainerStarted","Data":"288cd823a4a183458cae2fe540e422e1f2fc39af1627a08c128c0c3d90c2bda7"} Nov 28 13:30:17 crc kubenswrapper[4747]: I1128 13:30:17.949800 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.062195 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqr7q\" (UniqueName: \"kubernetes.io/projected/972ab70b-c999-4bc9-99f0-093443386e7b-kube-api-access-pqr7q\") pod \"972ab70b-c999-4bc9-99f0-093443386e7b\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.062307 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ab70b-c999-4bc9-99f0-093443386e7b-secret-volume\") pod \"972ab70b-c999-4bc9-99f0-093443386e7b\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.062345 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ab70b-c999-4bc9-99f0-093443386e7b-config-volume\") pod \"972ab70b-c999-4bc9-99f0-093443386e7b\" (UID: \"972ab70b-c999-4bc9-99f0-093443386e7b\") " Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.063193 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972ab70b-c999-4bc9-99f0-093443386e7b-config-volume" (OuterVolumeSpecName: "config-volume") pod "972ab70b-c999-4bc9-99f0-093443386e7b" (UID: "972ab70b-c999-4bc9-99f0-093443386e7b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.072292 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972ab70b-c999-4bc9-99f0-093443386e7b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "972ab70b-c999-4bc9-99f0-093443386e7b" (UID: "972ab70b-c999-4bc9-99f0-093443386e7b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.075438 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972ab70b-c999-4bc9-99f0-093443386e7b-kube-api-access-pqr7q" (OuterVolumeSpecName: "kube-api-access-pqr7q") pod "972ab70b-c999-4bc9-99f0-093443386e7b" (UID: "972ab70b-c999-4bc9-99f0-093443386e7b"). InnerVolumeSpecName "kube-api-access-pqr7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.164285 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ab70b-c999-4bc9-99f0-093443386e7b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.164343 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqr7q\" (UniqueName: \"kubernetes.io/projected/972ab70b-c999-4bc9-99f0-093443386e7b-kube-api-access-pqr7q\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.164368 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ab70b-c999-4bc9-99f0-093443386e7b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.645000 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" event={"ID":"972ab70b-c999-4bc9-99f0-093443386e7b","Type":"ContainerDied","Data":"288cd823a4a183458cae2fe540e422e1f2fc39af1627a08c128c0c3d90c2bda7"} Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.645367 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288cd823a4a183458cae2fe540e422e1f2fc39af1627a08c128c0c3d90c2bda7" Nov 28 13:30:18 crc kubenswrapper[4747]: I1128 13:30:18.645096 4747 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-nvs45" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.054558 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-dfbc4"] Nov 28 13:30:22 crc kubenswrapper[4747]: E1128 13:30:22.054847 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972ab70b-c999-4bc9-99f0-093443386e7b" containerName="collect-profiles" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.054864 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="972ab70b-c999-4bc9-99f0-093443386e7b" containerName="collect-profiles" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.054987 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="972ab70b-c999-4bc9-99f0-093443386e7b" containerName="collect-profiles" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.055437 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-dfbc4" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.058167 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-6n8kg" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.058720 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.059694 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.073254 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-dfbc4"] Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.225640 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f64d\" (UniqueName: 
\"kubernetes.io/projected/c2181372-db31-4a2c-8733-927ea9765806-kube-api-access-8f64d\") pod \"mariadb-operator-index-dfbc4\" (UID: \"c2181372-db31-4a2c-8733-927ea9765806\") " pod="openstack-operators/mariadb-operator-index-dfbc4" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.327752 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f64d\" (UniqueName: \"kubernetes.io/projected/c2181372-db31-4a2c-8733-927ea9765806-kube-api-access-8f64d\") pod \"mariadb-operator-index-dfbc4\" (UID: \"c2181372-db31-4a2c-8733-927ea9765806\") " pod="openstack-operators/mariadb-operator-index-dfbc4" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.365036 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f64d\" (UniqueName: \"kubernetes.io/projected/c2181372-db31-4a2c-8733-927ea9765806-kube-api-access-8f64d\") pod \"mariadb-operator-index-dfbc4\" (UID: \"c2181372-db31-4a2c-8733-927ea9765806\") " pod="openstack-operators/mariadb-operator-index-dfbc4" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.401703 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-dfbc4" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.657852 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptllq" Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.828611 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-dfbc4"] Nov 28 13:30:22 crc kubenswrapper[4747]: W1128 13:30:22.833302 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2181372_db31_4a2c_8733_927ea9765806.slice/crio-d1bb0b24c5a48f8d528982d4bb64a94c34dc49fae44aae724b06447d59717d14 WatchSource:0}: Error finding container d1bb0b24c5a48f8d528982d4bb64a94c34dc49fae44aae724b06447d59717d14: Status 404 returned error can't find the container with id d1bb0b24c5a48f8d528982d4bb64a94c34dc49fae44aae724b06447d59717d14 Nov 28 13:30:22 crc kubenswrapper[4747]: I1128 13:30:22.835774 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:30:23 crc kubenswrapper[4747]: I1128 13:30:23.675638 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-dfbc4" event={"ID":"c2181372-db31-4a2c-8733-927ea9765806","Type":"ContainerStarted","Data":"d1bb0b24c5a48f8d528982d4bb64a94c34dc49fae44aae724b06447d59717d14"} Nov 28 13:30:24 crc kubenswrapper[4747]: I1128 13:30:24.802277 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-dfbc4"] Nov 28 13:30:25 crc kubenswrapper[4747]: I1128 13:30:25.409473 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-d94nr"] Nov 28 13:30:25 crc kubenswrapper[4747]: I1128 13:30:25.410226 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:25 crc kubenswrapper[4747]: I1128 13:30:25.415479 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-d94nr"] Nov 28 13:30:25 crc kubenswrapper[4747]: I1128 13:30:25.571655 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm7nf\" (UniqueName: \"kubernetes.io/projected/ea3532b2-7349-4405-8425-5574724d1b9d-kube-api-access-gm7nf\") pod \"mariadb-operator-index-d94nr\" (UID: \"ea3532b2-7349-4405-8425-5574724d1b9d\") " pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:25 crc kubenswrapper[4747]: I1128 13:30:25.681482 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm7nf\" (UniqueName: \"kubernetes.io/projected/ea3532b2-7349-4405-8425-5574724d1b9d-kube-api-access-gm7nf\") pod \"mariadb-operator-index-d94nr\" (UID: \"ea3532b2-7349-4405-8425-5574724d1b9d\") " pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:25 crc kubenswrapper[4747]: I1128 13:30:25.712264 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm7nf\" (UniqueName: \"kubernetes.io/projected/ea3532b2-7349-4405-8425-5574724d1b9d-kube-api-access-gm7nf\") pod \"mariadb-operator-index-d94nr\" (UID: \"ea3532b2-7349-4405-8425-5574724d1b9d\") " pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:25 crc kubenswrapper[4747]: I1128 13:30:25.778833 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:29 crc kubenswrapper[4747]: I1128 13:30:29.712005 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-d94nr"] Nov 28 13:30:29 crc kubenswrapper[4747]: W1128 13:30:29.722425 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3532b2_7349_4405_8425_5574724d1b9d.slice/crio-028e2e192d4826a56db0637093c4d09feb9258240ccd49da7d702db2b9da73f7 WatchSource:0}: Error finding container 028e2e192d4826a56db0637093c4d09feb9258240ccd49da7d702db2b9da73f7: Status 404 returned error can't find the container with id 028e2e192d4826a56db0637093c4d09feb9258240ccd49da7d702db2b9da73f7 Nov 28 13:30:30 crc kubenswrapper[4747]: I1128 13:30:30.729185 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d94nr" event={"ID":"ea3532b2-7349-4405-8425-5574724d1b9d","Type":"ContainerStarted","Data":"a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143"} Nov 28 13:30:30 crc kubenswrapper[4747]: I1128 13:30:30.729588 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d94nr" event={"ID":"ea3532b2-7349-4405-8425-5574724d1b9d","Type":"ContainerStarted","Data":"028e2e192d4826a56db0637093c4d09feb9258240ccd49da7d702db2b9da73f7"} Nov 28 13:30:30 crc kubenswrapper[4747]: I1128 13:30:30.731285 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-dfbc4" event={"ID":"c2181372-db31-4a2c-8733-927ea9765806","Type":"ContainerStarted","Data":"a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde"} Nov 28 13:30:30 crc kubenswrapper[4747]: I1128 13:30:30.731425 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-dfbc4" 
podUID="c2181372-db31-4a2c-8733-927ea9765806" containerName="registry-server" containerID="cri-o://a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde" gracePeriod=2 Nov 28 13:30:30 crc kubenswrapper[4747]: I1128 13:30:30.787501 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-dfbc4" podStartSLOduration=2.038036432 podStartE2EDuration="8.787473892s" podCreationTimestamp="2025-11-28 13:30:22 +0000 UTC" firstStartedPulling="2025-11-28 13:30:22.835573961 +0000 UTC m=+675.498055691" lastFinishedPulling="2025-11-28 13:30:29.585011391 +0000 UTC m=+682.247493151" observedRunningTime="2025-11-28 13:30:30.781987325 +0000 UTC m=+683.444469095" watchObservedRunningTime="2025-11-28 13:30:30.787473892 +0000 UTC m=+683.449955652" Nov 28 13:30:30 crc kubenswrapper[4747]: I1128 13:30:30.788377 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-d94nr" podStartSLOduration=5.365891369 podStartE2EDuration="5.788367004s" podCreationTimestamp="2025-11-28 13:30:25 +0000 UTC" firstStartedPulling="2025-11-28 13:30:29.729265981 +0000 UTC m=+682.391747721" lastFinishedPulling="2025-11-28 13:30:30.151741596 +0000 UTC m=+682.814223356" observedRunningTime="2025-11-28 13:30:30.755635337 +0000 UTC m=+683.418117107" watchObservedRunningTime="2025-11-28 13:30:30.788367004 +0000 UTC m=+683.450848764" Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.151825 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-dfbc4" Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.266727 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f64d\" (UniqueName: \"kubernetes.io/projected/c2181372-db31-4a2c-8733-927ea9765806-kube-api-access-8f64d\") pod \"c2181372-db31-4a2c-8733-927ea9765806\" (UID: \"c2181372-db31-4a2c-8733-927ea9765806\") " Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.273254 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2181372-db31-4a2c-8733-927ea9765806-kube-api-access-8f64d" (OuterVolumeSpecName: "kube-api-access-8f64d") pod "c2181372-db31-4a2c-8733-927ea9765806" (UID: "c2181372-db31-4a2c-8733-927ea9765806"). InnerVolumeSpecName "kube-api-access-8f64d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.367997 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f64d\" (UniqueName: \"kubernetes.io/projected/c2181372-db31-4a2c-8733-927ea9765806-kube-api-access-8f64d\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.740456 4747 generic.go:334] "Generic (PLEG): container finished" podID="c2181372-db31-4a2c-8733-927ea9765806" containerID="a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde" exitCode=0 Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.740525 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-dfbc4" event={"ID":"c2181372-db31-4a2c-8733-927ea9765806","Type":"ContainerDied","Data":"a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde"} Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.740599 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-dfbc4" 
event={"ID":"c2181372-db31-4a2c-8733-927ea9765806","Type":"ContainerDied","Data":"d1bb0b24c5a48f8d528982d4bb64a94c34dc49fae44aae724b06447d59717d14"} Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.740639 4747 scope.go:117] "RemoveContainer" containerID="a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde" Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.740875 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-dfbc4" Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.769886 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-dfbc4"] Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.777588 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-dfbc4"] Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.777930 4747 scope.go:117] "RemoveContainer" containerID="a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde" Nov 28 13:30:31 crc kubenswrapper[4747]: E1128 13:30:31.778648 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde\": container with ID starting with a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde not found: ID does not exist" containerID="a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde" Nov 28 13:30:31 crc kubenswrapper[4747]: I1128 13:30:31.778885 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde"} err="failed to get container status \"a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde\": rpc error: code = NotFound desc = could not find container \"a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde\": container 
with ID starting with a7fb2350305e7e88a1bb2b1aa27d44a1e66228cdb722bb087868bd9d46169cde not found: ID does not exist" Nov 28 13:30:33 crc kubenswrapper[4747]: I1128 13:30:33.653619 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2181372-db31-4a2c-8733-927ea9765806" path="/var/lib/kubelet/pods/c2181372-db31-4a2c-8733-927ea9765806/volumes" Nov 28 13:30:35 crc kubenswrapper[4747]: I1128 13:30:35.779684 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:35 crc kubenswrapper[4747]: I1128 13:30:35.779756 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:35 crc kubenswrapper[4747]: I1128 13:30:35.818522 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:36 crc kubenswrapper[4747]: I1128 13:30:36.814998 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.318122 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx"] Nov 28 13:30:42 crc kubenswrapper[4747]: E1128 13:30:42.319399 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2181372-db31-4a2c-8733-927ea9765806" containerName="registry-server" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.319430 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2181372-db31-4a2c-8733-927ea9765806" containerName="registry-server" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.319675 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2181372-db31-4a2c-8733-927ea9765806" containerName="registry-server" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 
13:30:42.321421 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.324059 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-stc2j" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.334110 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx"] Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.426379 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvn69\" (UniqueName: \"kubernetes.io/projected/94b9b61c-6235-4ba0-9536-dd4bf65903b5-kube-api-access-zvn69\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.426845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.426962 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " 
pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.528165 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.528340 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.528461 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvn69\" (UniqueName: \"kubernetes.io/projected/94b9b61c-6235-4ba0-9536-dd4bf65903b5-kube-api-access-zvn69\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.528904 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.528949 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.571105 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvn69\" (UniqueName: \"kubernetes.io/projected/94b9b61c-6235-4ba0-9536-dd4bf65903b5-kube-api-access-zvn69\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.647720 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:42 crc kubenswrapper[4747]: I1128 13:30:42.951087 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx"] Nov 28 13:30:43 crc kubenswrapper[4747]: I1128 13:30:43.847447 4747 generic.go:334] "Generic (PLEG): container finished" podID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerID="e275b3e03de128d226d1b08411facc8073c5977d0b5053adbbbf538c92e4e835" exitCode=0 Nov 28 13:30:43 crc kubenswrapper[4747]: I1128 13:30:43.847542 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" event={"ID":"94b9b61c-6235-4ba0-9536-dd4bf65903b5","Type":"ContainerDied","Data":"e275b3e03de128d226d1b08411facc8073c5977d0b5053adbbbf538c92e4e835"} Nov 28 13:30:43 crc kubenswrapper[4747]: I1128 13:30:43.847911 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" event={"ID":"94b9b61c-6235-4ba0-9536-dd4bf65903b5","Type":"ContainerStarted","Data":"1f689045f4393b4bdc9c7e1f4899cdc4280c4c31fd00756f0bd6aeb98c7b1efb"} Nov 28 13:30:45 crc kubenswrapper[4747]: I1128 13:30:45.865884 4747 generic.go:334] "Generic (PLEG): container finished" podID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerID="6ac503c482c819ae5435a6f748c00cce9824025147e07bf86ca27fc6746d4d66" exitCode=0 Nov 28 13:30:45 crc kubenswrapper[4747]: I1128 13:30:45.865966 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" event={"ID":"94b9b61c-6235-4ba0-9536-dd4bf65903b5","Type":"ContainerDied","Data":"6ac503c482c819ae5435a6f748c00cce9824025147e07bf86ca27fc6746d4d66"} Nov 28 13:30:46 crc kubenswrapper[4747]: I1128 13:30:46.868276 4747 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 28 13:30:46 crc kubenswrapper[4747]: I1128 13:30:46.880619 4747 generic.go:334] "Generic (PLEG): container finished" podID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerID="dd9c6ef8beb9b315cf7edd21e213068c5efc6ffcc4995b9a1ba243122de303a3" exitCode=0 Nov 28 13:30:46 crc kubenswrapper[4747]: I1128 13:30:46.880694 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" event={"ID":"94b9b61c-6235-4ba0-9536-dd4bf65903b5","Type":"ContainerDied","Data":"dd9c6ef8beb9b315cf7edd21e213068c5efc6ffcc4995b9a1ba243122de303a3"} Nov 28 13:30:47 crc kubenswrapper[4747]: I1128 13:30:47.633250 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Nov 28 13:30:47 crc kubenswrapper[4747]: I1128 13:30:47.633346 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.262707 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.416482 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvn69\" (UniqueName: \"kubernetes.io/projected/94b9b61c-6235-4ba0-9536-dd4bf65903b5-kube-api-access-zvn69\") pod \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.416615 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-bundle\") pod \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.416668 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-util\") pod \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\" (UID: \"94b9b61c-6235-4ba0-9536-dd4bf65903b5\") " Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.418378 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-bundle" (OuterVolumeSpecName: "bundle") pod "94b9b61c-6235-4ba0-9536-dd4bf65903b5" 
(UID: "94b9b61c-6235-4ba0-9536-dd4bf65903b5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.425496 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b9b61c-6235-4ba0-9536-dd4bf65903b5-kube-api-access-zvn69" (OuterVolumeSpecName: "kube-api-access-zvn69") pod "94b9b61c-6235-4ba0-9536-dd4bf65903b5" (UID: "94b9b61c-6235-4ba0-9536-dd4bf65903b5"). InnerVolumeSpecName "kube-api-access-zvn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.518480 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvn69\" (UniqueName: \"kubernetes.io/projected/94b9b61c-6235-4ba0-9536-dd4bf65903b5-kube-api-access-zvn69\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.518547 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.631250 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-util" (OuterVolumeSpecName: "util") pod "94b9b61c-6235-4ba0-9536-dd4bf65903b5" (UID: "94b9b61c-6235-4ba0-9536-dd4bf65903b5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.721037 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94b9b61c-6235-4ba0-9536-dd4bf65903b5-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.898929 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" event={"ID":"94b9b61c-6235-4ba0-9536-dd4bf65903b5","Type":"ContainerDied","Data":"1f689045f4393b4bdc9c7e1f4899cdc4280c4c31fd00756f0bd6aeb98c7b1efb"} Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.898993 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f689045f4393b4bdc9c7e1f4899cdc4280c4c31fd00756f0bd6aeb98c7b1efb" Nov 28 13:30:48 crc kubenswrapper[4747]: I1128 13:30:48.899063 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.421475 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b"] Nov 28 13:30:55 crc kubenswrapper[4747]: E1128 13:30:55.422303 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerName="pull" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.422320 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerName="pull" Nov 28 13:30:55 crc kubenswrapper[4747]: E1128 13:30:55.422342 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerName="util" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.422350 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerName="util" Nov 28 13:30:55 crc kubenswrapper[4747]: E1128 13:30:55.422367 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerName="extract" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.422376 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerName="extract" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.422497 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" containerName="extract" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.422975 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.429642 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.429710 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.429889 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8ldd5" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.437747 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b"] Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.518164 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-webhook-cert\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: 
\"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.518280 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwf8\" (UniqueName: \"kubernetes.io/projected/178b0a8c-9539-48dd-b483-09228bc22b6d-kube-api-access-wcwf8\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.518348 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-apiservice-cert\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.619601 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-webhook-cert\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.619668 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcwf8\" (UniqueName: \"kubernetes.io/projected/178b0a8c-9539-48dd-b483-09228bc22b6d-kube-api-access-wcwf8\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc 
kubenswrapper[4747]: I1128 13:30:55.619742 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-apiservice-cert\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.627801 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-apiservice-cert\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.628377 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-webhook-cert\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.635845 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwf8\" (UniqueName: \"kubernetes.io/projected/178b0a8c-9539-48dd-b483-09228bc22b6d-kube-api-access-wcwf8\") pod \"mariadb-operator-controller-manager-6d7db75cbb-kfl8b\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:55 crc kubenswrapper[4747]: I1128 13:30:55.751101 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:30:56 crc kubenswrapper[4747]: I1128 13:30:56.054922 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b"] Nov 28 13:30:56 crc kubenswrapper[4747]: I1128 13:30:56.945826 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" event={"ID":"178b0a8c-9539-48dd-b483-09228bc22b6d","Type":"ContainerStarted","Data":"392e0d81152674bfe317ae9cdd63a92f4807933f5f63b5b35b1d5480844ef447"} Nov 28 13:31:01 crc kubenswrapper[4747]: I1128 13:31:01.980151 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" event={"ID":"178b0a8c-9539-48dd-b483-09228bc22b6d","Type":"ContainerStarted","Data":"67f8ee89b90ec1bc94d6660de3a0d7513231c0e92f0296d258674e7e2b8eb714"} Nov 28 13:31:01 crc kubenswrapper[4747]: I1128 13:31:01.982449 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.008968 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" podStartSLOduration=2.172861177 podStartE2EDuration="7.008938744s" podCreationTimestamp="2025-11-28 13:30:55 +0000 UTC" firstStartedPulling="2025-11-28 13:30:56.062434373 +0000 UTC m=+708.724916103" lastFinishedPulling="2025-11-28 13:31:00.89851194 +0000 UTC m=+713.560993670" observedRunningTime="2025-11-28 13:31:02.005001756 +0000 UTC m=+714.667483516" watchObservedRunningTime="2025-11-28 13:31:02.008938744 +0000 UTC m=+714.671420494" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.313662 4747 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-pf4c5"] Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.315298 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.338541 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pf4c5"] Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.420766 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxt7g\" (UniqueName: \"kubernetes.io/projected/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-kube-api-access-wxt7g\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.420936 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-catalog-content\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.420990 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-utilities\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.522262 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxt7g\" (UniqueName: \"kubernetes.io/projected/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-kube-api-access-wxt7g\") pod \"certified-operators-pf4c5\" (UID: 
\"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.522383 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-catalog-content\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.522418 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-utilities\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.522992 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-utilities\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.523256 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-catalog-content\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.547673 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxt7g\" (UniqueName: \"kubernetes.io/projected/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-kube-api-access-wxt7g\") pod \"certified-operators-pf4c5\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " 
pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.634152 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.907751 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pf4c5"] Nov 28 13:31:02 crc kubenswrapper[4747]: I1128 13:31:02.986581 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pf4c5" event={"ID":"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14","Type":"ContainerStarted","Data":"d12a3b60d9f3975a218bbfb2f6931f0202d4881d9eaeed63e4bcc06a96fce7e4"} Nov 28 13:31:03 crc kubenswrapper[4747]: I1128 13:31:03.992411 4747 generic.go:334] "Generic (PLEG): container finished" podID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerID="83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc" exitCode=0 Nov 28 13:31:03 crc kubenswrapper[4747]: I1128 13:31:03.992459 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pf4c5" event={"ID":"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14","Type":"ContainerDied","Data":"83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc"} Nov 28 13:31:05 crc kubenswrapper[4747]: I1128 13:31:04.999671 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pf4c5" event={"ID":"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14","Type":"ContainerStarted","Data":"376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883"} Nov 28 13:31:06 crc kubenswrapper[4747]: I1128 13:31:06.009391 4747 generic.go:334] "Generic (PLEG): container finished" podID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerID="376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883" exitCode=0 Nov 28 13:31:06 crc kubenswrapper[4747]: I1128 13:31:06.009442 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-pf4c5" event={"ID":"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14","Type":"ContainerDied","Data":"376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883"} Nov 28 13:31:07 crc kubenswrapper[4747]: I1128 13:31:07.018051 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pf4c5" event={"ID":"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14","Type":"ContainerStarted","Data":"628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6"} Nov 28 13:31:07 crc kubenswrapper[4747]: I1128 13:31:07.045143 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pf4c5" podStartSLOduration=2.217871851 podStartE2EDuration="5.04512803s" podCreationTimestamp="2025-11-28 13:31:02 +0000 UTC" firstStartedPulling="2025-11-28 13:31:03.995156725 +0000 UTC m=+716.657638445" lastFinishedPulling="2025-11-28 13:31:06.822412894 +0000 UTC m=+719.484894624" observedRunningTime="2025-11-28 13:31:07.04206872 +0000 UTC m=+719.704550450" watchObservedRunningTime="2025-11-28 13:31:07.04512803 +0000 UTC m=+719.707609760" Nov 28 13:31:12 crc kubenswrapper[4747]: I1128 13:31:12.634573 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:12 crc kubenswrapper[4747]: I1128 13:31:12.635616 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:12 crc kubenswrapper[4747]: I1128 13:31:12.694144 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:13 crc kubenswrapper[4747]: I1128 13:31:13.102789 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:15 crc kubenswrapper[4747]: I1128 
13:31:15.104504 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pf4c5"] Nov 28 13:31:15 crc kubenswrapper[4747]: I1128 13:31:15.756927 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:31:16 crc kubenswrapper[4747]: I1128 13:31:16.076869 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pf4c5" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="registry-server" containerID="cri-o://628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6" gracePeriod=2 Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.036022 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.081754 4747 generic.go:334] "Generic (PLEG): container finished" podID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerID="628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6" exitCode=0 Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.081792 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pf4c5" event={"ID":"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14","Type":"ContainerDied","Data":"628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6"} Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.081803 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pf4c5" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.081818 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pf4c5" event={"ID":"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14","Type":"ContainerDied","Data":"d12a3b60d9f3975a218bbfb2f6931f0202d4881d9eaeed63e4bcc06a96fce7e4"} Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.081835 4747 scope.go:117] "RemoveContainer" containerID="628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.100761 4747 scope.go:117] "RemoveContainer" containerID="376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.122847 4747 scope.go:117] "RemoveContainer" containerID="83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.143193 4747 scope.go:117] "RemoveContainer" containerID="628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6" Nov 28 13:31:17 crc kubenswrapper[4747]: E1128 13:31:17.143647 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6\": container with ID starting with 628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6 not found: ID does not exist" containerID="628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.143701 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6"} err="failed to get container status \"628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6\": rpc error: code = NotFound desc = could not find container 
\"628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6\": container with ID starting with 628fd5b05dee1de1261deb1a895b2cc1521f7800017140047af8a635cbf949c6 not found: ID does not exist" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.143733 4747 scope.go:117] "RemoveContainer" containerID="376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883" Nov 28 13:31:17 crc kubenswrapper[4747]: E1128 13:31:17.144063 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883\": container with ID starting with 376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883 not found: ID does not exist" containerID="376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.144118 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883"} err="failed to get container status \"376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883\": rpc error: code = NotFound desc = could not find container \"376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883\": container with ID starting with 376011c360152d523e28db061db78dc2cd2928cdab957b1c474113fd41334883 not found: ID does not exist" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.144154 4747 scope.go:117] "RemoveContainer" containerID="83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc" Nov 28 13:31:17 crc kubenswrapper[4747]: E1128 13:31:17.146920 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc\": container with ID starting with 83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc not found: ID does not exist" 
containerID="83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.146949 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc"} err="failed to get container status \"83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc\": rpc error: code = NotFound desc = could not find container \"83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc\": container with ID starting with 83f2c03fdf61cc2d1beac275e2cad634c884cd1708603a1425f2c9f386664cbc not found: ID does not exist" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.153388 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-utilities\") pod \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.153506 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-catalog-content\") pod \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.153549 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxt7g\" (UniqueName: \"kubernetes.io/projected/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-kube-api-access-wxt7g\") pod \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\" (UID: \"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14\") " Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.155433 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-utilities" (OuterVolumeSpecName: "utilities") pod 
"5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" (UID: "5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.160392 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-kube-api-access-wxt7g" (OuterVolumeSpecName: "kube-api-access-wxt7g") pod "5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" (UID: "5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14"). InnerVolumeSpecName "kube-api-access-wxt7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.210759 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" (UID: "5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.254513 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.254551 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxt7g\" (UniqueName: \"kubernetes.io/projected/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-kube-api-access-wxt7g\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.254568 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.410020 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pf4c5"] Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.415033 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pf4c5"] Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.632591 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:31:17 crc kubenswrapper[4747]: I1128 13:31:17.632652 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:31:17 crc kubenswrapper[4747]: 
I1128 13:31:17.647200 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" path="/var/lib/kubelet/pods/5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14/volumes" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.145585 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd"] Nov 28 13:31:21 crc kubenswrapper[4747]: E1128 13:31:21.146132 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="registry-server" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.146146 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="registry-server" Nov 28 13:31:21 crc kubenswrapper[4747]: E1128 13:31:21.146160 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="extract-content" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.146167 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="extract-content" Nov 28 13:31:21 crc kubenswrapper[4747]: E1128 13:31:21.146191 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="extract-utilities" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.146198 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="extract-utilities" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.146366 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2c4e74-a74a-43dc-ab4f-9ff2a7331f14" containerName="registry-server" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.147236 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.150263 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.171409 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd"] Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.203658 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.203894 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.203992 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvgw\" (UniqueName: \"kubernetes.io/projected/15555506-9bd3-401c-b26e-52cd441c0663-kube-api-access-xpvgw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: 
I1128 13:31:21.306054 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.306165 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.306252 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvgw\" (UniqueName: \"kubernetes.io/projected/15555506-9bd3-401c-b26e-52cd441c0663-kube-api-access-xpvgw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.306572 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.306797 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.325092 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvgw\" (UniqueName: \"kubernetes.io/projected/15555506-9bd3-401c-b26e-52cd441c0663-kube-api-access-xpvgw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.472898 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:21 crc kubenswrapper[4747]: I1128 13:31:21.926253 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd"] Nov 28 13:31:22 crc kubenswrapper[4747]: I1128 13:31:22.118699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" event={"ID":"15555506-9bd3-401c-b26e-52cd441c0663","Type":"ContainerStarted","Data":"c0fe911e196e825da27a23e0a0dd013e184845137eeac472bbd682524d710dfc"} Nov 28 13:31:22 crc kubenswrapper[4747]: I1128 13:31:22.119017 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" event={"ID":"15555506-9bd3-401c-b26e-52cd441c0663","Type":"ContainerStarted","Data":"6ca3cfef76351f0b98b4886e9365312c4b30d2a69cf2dd197f76f2298830badb"} Nov 28 13:31:23 crc kubenswrapper[4747]: I1128 13:31:23.126712 4747 
generic.go:334] "Generic (PLEG): container finished" podID="15555506-9bd3-401c-b26e-52cd441c0663" containerID="c0fe911e196e825da27a23e0a0dd013e184845137eeac472bbd682524d710dfc" exitCode=0 Nov 28 13:31:23 crc kubenswrapper[4747]: I1128 13:31:23.126904 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" event={"ID":"15555506-9bd3-401c-b26e-52cd441c0663","Type":"ContainerDied","Data":"c0fe911e196e825da27a23e0a0dd013e184845137eeac472bbd682524d710dfc"} Nov 28 13:31:24 crc kubenswrapper[4747]: I1128 13:31:24.922767 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b64z4"] Nov 28 13:31:24 crc kubenswrapper[4747]: I1128 13:31:24.925434 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:24 crc kubenswrapper[4747]: I1128 13:31:24.938228 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b64z4"] Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.063606 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-utilities\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.063724 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wss7l\" (UniqueName: \"kubernetes.io/projected/9aa74d52-c7a3-48da-ab38-4930c6eb135b-kube-api-access-wss7l\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.064071 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-catalog-content\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.141914 4747 generic.go:334] "Generic (PLEG): container finished" podID="15555506-9bd3-401c-b26e-52cd441c0663" containerID="6e025c2f8c4a929cb22f950a772eaae275ac15aa4c9fe69852543a9214770900" exitCode=0 Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.141959 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" event={"ID":"15555506-9bd3-401c-b26e-52cd441c0663","Type":"ContainerDied","Data":"6e025c2f8c4a929cb22f950a772eaae275ac15aa4c9fe69852543a9214770900"} Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.166155 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-utilities\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.166251 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wss7l\" (UniqueName: \"kubernetes.io/projected/9aa74d52-c7a3-48da-ab38-4930c6eb135b-kube-api-access-wss7l\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.166305 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-catalog-content\") pod 
\"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.166946 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-catalog-content\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.167352 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-utilities\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.191890 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wss7l\" (UniqueName: \"kubernetes.io/projected/9aa74d52-c7a3-48da-ab38-4930c6eb135b-kube-api-access-wss7l\") pod \"redhat-operators-b64z4\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.258327 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:25 crc kubenswrapper[4747]: I1128 13:31:25.447420 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b64z4"] Nov 28 13:31:25 crc kubenswrapper[4747]: W1128 13:31:25.451961 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa74d52_c7a3_48da_ab38_4930c6eb135b.slice/crio-15b1d6ad7abe1a9ce9158346e4587f38f7fb13b6f7047f09536cdd55b1fa146a WatchSource:0}: Error finding container 15b1d6ad7abe1a9ce9158346e4587f38f7fb13b6f7047f09536cdd55b1fa146a: Status 404 returned error can't find the container with id 15b1d6ad7abe1a9ce9158346e4587f38f7fb13b6f7047f09536cdd55b1fa146a Nov 28 13:31:26 crc kubenswrapper[4747]: I1128 13:31:26.148474 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b64z4" event={"ID":"9aa74d52-c7a3-48da-ab38-4930c6eb135b","Type":"ContainerStarted","Data":"15b1d6ad7abe1a9ce9158346e4587f38f7fb13b6f7047f09536cdd55b1fa146a"} Nov 28 13:31:27 crc kubenswrapper[4747]: I1128 13:31:27.155335 4747 generic.go:334] "Generic (PLEG): container finished" podID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerID="e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631" exitCode=0 Nov 28 13:31:27 crc kubenswrapper[4747]: I1128 13:31:27.155845 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b64z4" event={"ID":"9aa74d52-c7a3-48da-ab38-4930c6eb135b","Type":"ContainerDied","Data":"e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631"} Nov 28 13:31:27 crc kubenswrapper[4747]: I1128 13:31:27.160232 4747 generic.go:334] "Generic (PLEG): container finished" podID="15555506-9bd3-401c-b26e-52cd441c0663" containerID="6269423819a39bf6fbeba247c1c8054f5f2712b3520443a652217b3597d8b176" exitCode=0 Nov 28 13:31:27 crc kubenswrapper[4747]: I1128 13:31:27.160338 
4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" event={"ID":"15555506-9bd3-401c-b26e-52cd441c0663","Type":"ContainerDied","Data":"6269423819a39bf6fbeba247c1c8054f5f2712b3520443a652217b3597d8b176"} Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.167548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b64z4" event={"ID":"9aa74d52-c7a3-48da-ab38-4930c6eb135b","Type":"ContainerStarted","Data":"ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7"} Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.482329 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.625596 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-util\") pod \"15555506-9bd3-401c-b26e-52cd441c0663\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.626179 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-bundle\") pod \"15555506-9bd3-401c-b26e-52cd441c0663\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.626366 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpvgw\" (UniqueName: \"kubernetes.io/projected/15555506-9bd3-401c-b26e-52cd441c0663-kube-api-access-xpvgw\") pod \"15555506-9bd3-401c-b26e-52cd441c0663\" (UID: \"15555506-9bd3-401c-b26e-52cd441c0663\") " Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.627744 4747 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-bundle" (OuterVolumeSpecName: "bundle") pod "15555506-9bd3-401c-b26e-52cd441c0663" (UID: "15555506-9bd3-401c-b26e-52cd441c0663"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.635581 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15555506-9bd3-401c-b26e-52cd441c0663-kube-api-access-xpvgw" (OuterVolumeSpecName: "kube-api-access-xpvgw") pod "15555506-9bd3-401c-b26e-52cd441c0663" (UID: "15555506-9bd3-401c-b26e-52cd441c0663"). InnerVolumeSpecName "kube-api-access-xpvgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.648077 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-util" (OuterVolumeSpecName: "util") pod "15555506-9bd3-401c-b26e-52cd441c0663" (UID: "15555506-9bd3-401c-b26e-52cd441c0663"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.728618 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.728685 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15555506-9bd3-401c-b26e-52cd441c0663-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:28 crc kubenswrapper[4747]: I1128 13:31:28.728713 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpvgw\" (UniqueName: \"kubernetes.io/projected/15555506-9bd3-401c-b26e-52cd441c0663-kube-api-access-xpvgw\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:29 crc kubenswrapper[4747]: I1128 13:31:29.178358 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" Nov 28 13:31:29 crc kubenswrapper[4747]: I1128 13:31:29.178331 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd" event={"ID":"15555506-9bd3-401c-b26e-52cd441c0663","Type":"ContainerDied","Data":"6ca3cfef76351f0b98b4886e9365312c4b30d2a69cf2dd197f76f2298830badb"} Nov 28 13:31:29 crc kubenswrapper[4747]: I1128 13:31:29.178814 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ca3cfef76351f0b98b4886e9365312c4b30d2a69cf2dd197f76f2298830badb" Nov 28 13:31:29 crc kubenswrapper[4747]: I1128 13:31:29.182199 4747 generic.go:334] "Generic (PLEG): container finished" podID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerID="ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7" exitCode=0 Nov 28 13:31:29 crc kubenswrapper[4747]: I1128 13:31:29.182269 4747 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b64z4" event={"ID":"9aa74d52-c7a3-48da-ab38-4930c6eb135b","Type":"ContainerDied","Data":"ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7"} Nov 28 13:31:30 crc kubenswrapper[4747]: I1128 13:31:30.195800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b64z4" event={"ID":"9aa74d52-c7a3-48da-ab38-4930c6eb135b","Type":"ContainerStarted","Data":"f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5"} Nov 28 13:31:30 crc kubenswrapper[4747]: I1128 13:31:30.224700 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b64z4" podStartSLOduration=3.416056313 podStartE2EDuration="6.224670679s" podCreationTimestamp="2025-11-28 13:31:24 +0000 UTC" firstStartedPulling="2025-11-28 13:31:27.15703543 +0000 UTC m=+739.819517160" lastFinishedPulling="2025-11-28 13:31:29.965649756 +0000 UTC m=+742.628131526" observedRunningTime="2025-11-28 13:31:30.222644623 +0000 UTC m=+742.885126393" watchObservedRunningTime="2025-11-28 13:31:30.224670679 +0000 UTC m=+742.887152449" Nov 28 13:31:35 crc kubenswrapper[4747]: I1128 13:31:35.259102 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:35 crc kubenswrapper[4747]: I1128 13:31:35.259500 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:36 crc kubenswrapper[4747]: I1128 13:31:36.300353 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b64z4" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="registry-server" probeResult="failure" output=< Nov 28 13:31:36 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Nov 28 13:31:36 crc kubenswrapper[4747]: > Nov 28 13:31:38 crc 
kubenswrapper[4747]: I1128 13:31:38.898925 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7456584b94-v95js"] Nov 28 13:31:38 crc kubenswrapper[4747]: E1128 13:31:38.899504 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15555506-9bd3-401c-b26e-52cd441c0663" containerName="extract" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.899519 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="15555506-9bd3-401c-b26e-52cd441c0663" containerName="extract" Nov 28 13:31:38 crc kubenswrapper[4747]: E1128 13:31:38.899531 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15555506-9bd3-401c-b26e-52cd441c0663" containerName="util" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.899537 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="15555506-9bd3-401c-b26e-52cd441c0663" containerName="util" Nov 28 13:31:38 crc kubenswrapper[4747]: E1128 13:31:38.899554 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15555506-9bd3-401c-b26e-52cd441c0663" containerName="pull" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.899559 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="15555506-9bd3-401c-b26e-52cd441c0663" containerName="pull" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.899689 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="15555506-9bd3-401c-b26e-52cd441c0663" containerName="extract" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.900099 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.905014 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-q6q8g" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.905650 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.905852 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.906955 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.908364 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.925790 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7456584b94-v95js"] Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.967151 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6964\" (UniqueName: \"kubernetes.io/projected/2f574e4d-330e-47db-85f9-48558244cccb-kube-api-access-t6964\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.967201 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f574e4d-330e-47db-85f9-48558244cccb-webhook-cert\") pod 
\"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:38 crc kubenswrapper[4747]: I1128 13:31:38.967262 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f574e4d-330e-47db-85f9-48558244cccb-apiservice-cert\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.068406 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6964\" (UniqueName: \"kubernetes.io/projected/2f574e4d-330e-47db-85f9-48558244cccb-kube-api-access-t6964\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.068467 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f574e4d-330e-47db-85f9-48558244cccb-webhook-cert\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.068528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f574e4d-330e-47db-85f9-48558244cccb-apiservice-cert\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc 
kubenswrapper[4747]: I1128 13:31:39.076403 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f574e4d-330e-47db-85f9-48558244cccb-webhook-cert\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.076882 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f574e4d-330e-47db-85f9-48558244cccb-apiservice-cert\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.102385 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6964\" (UniqueName: \"kubernetes.io/projected/2f574e4d-330e-47db-85f9-48558244cccb-kube-api-access-t6964\") pod \"metallb-operator-controller-manager-7456584b94-v95js\" (UID: \"2f574e4d-330e-47db-85f9-48558244cccb\") " pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.217746 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.345358 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw"] Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.346449 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.348223 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.348967 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.349105 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qr8dq" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.360618 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw"] Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.472662 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57bc32d2-9f9f-4c31-9fdc-79027059691e-apiservice-cert\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.472702 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqtlx\" (UniqueName: \"kubernetes.io/projected/57bc32d2-9f9f-4c31-9fdc-79027059691e-kube-api-access-bqtlx\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.472724 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/57bc32d2-9f9f-4c31-9fdc-79027059691e-webhook-cert\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.574018 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57bc32d2-9f9f-4c31-9fdc-79027059691e-apiservice-cert\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.574068 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqtlx\" (UniqueName: \"kubernetes.io/projected/57bc32d2-9f9f-4c31-9fdc-79027059691e-kube-api-access-bqtlx\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.574103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57bc32d2-9f9f-4c31-9fdc-79027059691e-webhook-cert\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.579609 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57bc32d2-9f9f-4c31-9fdc-79027059691e-webhook-cert\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc 
kubenswrapper[4747]: I1128 13:31:39.581391 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57bc32d2-9f9f-4c31-9fdc-79027059691e-apiservice-cert\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.589476 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqtlx\" (UniqueName: \"kubernetes.io/projected/57bc32d2-9f9f-4c31-9fdc-79027059691e-kube-api-access-bqtlx\") pod \"metallb-operator-webhook-server-58f86dcff-55hdw\" (UID: \"57bc32d2-9f9f-4c31-9fdc-79027059691e\") " pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.689834 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:39 crc kubenswrapper[4747]: I1128 13:31:39.690926 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7456584b94-v95js"] Nov 28 13:31:40 crc kubenswrapper[4747]: I1128 13:31:40.095513 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw"] Nov 28 13:31:40 crc kubenswrapper[4747]: W1128 13:31:40.100927 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57bc32d2_9f9f_4c31_9fdc_79027059691e.slice/crio-4ec638ef42905eab23273f33fcd7ca35717fa1fd41ba61a5d9c2430a46a5c780 WatchSource:0}: Error finding container 4ec638ef42905eab23273f33fcd7ca35717fa1fd41ba61a5d9c2430a46a5c780: Status 404 returned error can't find the container with id 4ec638ef42905eab23273f33fcd7ca35717fa1fd41ba61a5d9c2430a46a5c780 Nov 28 13:31:40 crc 
kubenswrapper[4747]: I1128 13:31:40.253994 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" event={"ID":"57bc32d2-9f9f-4c31-9fdc-79027059691e","Type":"ContainerStarted","Data":"4ec638ef42905eab23273f33fcd7ca35717fa1fd41ba61a5d9c2430a46a5c780"} Nov 28 13:31:40 crc kubenswrapper[4747]: I1128 13:31:40.255034 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" event={"ID":"2f574e4d-330e-47db-85f9-48558244cccb","Type":"ContainerStarted","Data":"2dd6b6a8eac9d354e02ee6464aa68dfb9719e468f6f193fe4196308ccb2ab7d6"} Nov 28 13:31:44 crc kubenswrapper[4747]: I1128 13:31:44.286411 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" event={"ID":"2f574e4d-330e-47db-85f9-48558244cccb","Type":"ContainerStarted","Data":"3a63c5548c39ac9173f6232d7d1b4813c897dc67849d8ec2c4e98ffeb0401d01"} Nov 28 13:31:44 crc kubenswrapper[4747]: I1128 13:31:44.286901 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:31:44 crc kubenswrapper[4747]: I1128 13:31:44.313463 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" podStartSLOduration=2.847578373 podStartE2EDuration="6.313443888s" podCreationTimestamp="2025-11-28 13:31:38 +0000 UTC" firstStartedPulling="2025-11-28 13:31:39.704184177 +0000 UTC m=+752.366665907" lastFinishedPulling="2025-11-28 13:31:43.170049692 +0000 UTC m=+755.832531422" observedRunningTime="2025-11-28 13:31:44.305889706 +0000 UTC m=+756.968371446" watchObservedRunningTime="2025-11-28 13:31:44.313443888 +0000 UTC m=+756.975925618" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.308709 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" event={"ID":"57bc32d2-9f9f-4c31-9fdc-79027059691e","Type":"ContainerStarted","Data":"438f9f70c6cd65ca13b408a7cc412057629c83a2a5cbb81db87f1c6150e5cb17"} Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.314786 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.341641 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" podStartSLOduration=1.8792830459999998 podStartE2EDuration="6.341621969s" podCreationTimestamp="2025-11-28 13:31:39 +0000 UTC" firstStartedPulling="2025-11-28 13:31:40.108755877 +0000 UTC m=+752.771237607" lastFinishedPulling="2025-11-28 13:31:44.5710948 +0000 UTC m=+757.233576530" observedRunningTime="2025-11-28 13:31:45.334966928 +0000 UTC m=+757.997448668" watchObservedRunningTime="2025-11-28 13:31:45.341621969 +0000 UTC m=+758.004103709" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.349678 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qldx4"] Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.353292 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.353326 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.385266 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qldx4"] Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.433329 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.486455 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-catalog-content\") pod \"community-operators-qldx4\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.486850 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65l8\" (UniqueName: \"kubernetes.io/projected/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-kube-api-access-c65l8\") pod \"community-operators-qldx4\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.487013 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-utilities\") pod \"community-operators-qldx4\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.588155 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-catalog-content\") pod \"community-operators-qldx4\" (UID: 
\"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.588915 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65l8\" (UniqueName: \"kubernetes.io/projected/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-kube-api-access-c65l8\") pod \"community-operators-qldx4\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.589309 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-utilities\") pod \"community-operators-qldx4\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.588854 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-catalog-content\") pod \"community-operators-qldx4\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.589605 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-utilities\") pod \"community-operators-qldx4\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.606996 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65l8\" (UniqueName: \"kubernetes.io/projected/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-kube-api-access-c65l8\") pod \"community-operators-qldx4\" (UID: 
\"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:45 crc kubenswrapper[4747]: I1128 13:31:45.698373 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:46 crc kubenswrapper[4747]: I1128 13:31:46.290173 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qldx4"] Nov 28 13:31:46 crc kubenswrapper[4747]: W1128 13:31:46.307263 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e941a58_d4e6_4bb8_b46a_b0f288f656bd.slice/crio-34359af446d3b05c886f9eaa8f212f755e744b7ee300945aea1f64e6b00527ed WatchSource:0}: Error finding container 34359af446d3b05c886f9eaa8f212f755e744b7ee300945aea1f64e6b00527ed: Status 404 returned error can't find the container with id 34359af446d3b05c886f9eaa8f212f755e744b7ee300945aea1f64e6b00527ed Nov 28 13:31:46 crc kubenswrapper[4747]: I1128 13:31:46.316523 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qldx4" event={"ID":"2e941a58-d4e6-4bb8-b46a-b0f288f656bd","Type":"ContainerStarted","Data":"34359af446d3b05c886f9eaa8f212f755e744b7ee300945aea1f64e6b00527ed"} Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.324620 4747 generic.go:334] "Generic (PLEG): container finished" podID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerID="dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d" exitCode=0 Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.324712 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qldx4" event={"ID":"2e941a58-d4e6-4bb8-b46a-b0f288f656bd","Type":"ContainerDied","Data":"dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d"} Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.632974 4747 patch_prober.go:28] 
interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.633061 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.633114 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.633761 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5fdc296405c58b503731bb8ebbd3318202226659d1222af8e629d5358c8f2a8d"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.633838 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://5fdc296405c58b503731bb8ebbd3318202226659d1222af8e629d5358c8f2a8d" gracePeriod=600 Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.702710 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b64z4"] Nov 28 13:31:47 crc kubenswrapper[4747]: I1128 13:31:47.703125 4747 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-b64z4" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="registry-server" containerID="cri-o://f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5" gracePeriod=2 Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.153613 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.227283 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-catalog-content\") pod \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.227412 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wss7l\" (UniqueName: \"kubernetes.io/projected/9aa74d52-c7a3-48da-ab38-4930c6eb135b-kube-api-access-wss7l\") pod \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.227447 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-utilities\") pod \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\" (UID: \"9aa74d52-c7a3-48da-ab38-4930c6eb135b\") " Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.228583 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-utilities" (OuterVolumeSpecName: "utilities") pod "9aa74d52-c7a3-48da-ab38-4930c6eb135b" (UID: "9aa74d52-c7a3-48da-ab38-4930c6eb135b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.233769 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa74d52-c7a3-48da-ab38-4930c6eb135b-kube-api-access-wss7l" (OuterVolumeSpecName: "kube-api-access-wss7l") pod "9aa74d52-c7a3-48da-ab38-4930c6eb135b" (UID: "9aa74d52-c7a3-48da-ab38-4930c6eb135b"). InnerVolumeSpecName "kube-api-access-wss7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.329624 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wss7l\" (UniqueName: \"kubernetes.io/projected/9aa74d52-c7a3-48da-ab38-4930c6eb135b-kube-api-access-wss7l\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.329928 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.332138 4747 generic.go:334] "Generic (PLEG): container finished" podID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerID="f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5" exitCode=0 Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.332200 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b64z4" event={"ID":"9aa74d52-c7a3-48da-ab38-4930c6eb135b","Type":"ContainerDied","Data":"f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5"} Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.332240 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b64z4" event={"ID":"9aa74d52-c7a3-48da-ab38-4930c6eb135b","Type":"ContainerDied","Data":"15b1d6ad7abe1a9ce9158346e4587f38f7fb13b6f7047f09536cdd55b1fa146a"} Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 
13:31:48.332257 4747 scope.go:117] "RemoveContainer" containerID="f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.332379 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b64z4" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.337928 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qldx4" event={"ID":"2e941a58-d4e6-4bb8-b46a-b0f288f656bd","Type":"ContainerStarted","Data":"25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01"} Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.342986 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="5fdc296405c58b503731bb8ebbd3318202226659d1222af8e629d5358c8f2a8d" exitCode=0 Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.343019 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"5fdc296405c58b503731bb8ebbd3318202226659d1222af8e629d5358c8f2a8d"} Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.343048 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"b7faf1b409a382c4ed714300a1dd00c81a6791b386fd5f862cfc6c604d1093bb"} Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.351574 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9aa74d52-c7a3-48da-ab38-4930c6eb135b" (UID: "9aa74d52-c7a3-48da-ab38-4930c6eb135b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.356676 4747 scope.go:117] "RemoveContainer" containerID="ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.378270 4747 scope.go:117] "RemoveContainer" containerID="e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.395887 4747 scope.go:117] "RemoveContainer" containerID="f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5" Nov 28 13:31:48 crc kubenswrapper[4747]: E1128 13:31:48.398663 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5\": container with ID starting with f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5 not found: ID does not exist" containerID="f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.398700 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5"} err="failed to get container status \"f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5\": rpc error: code = NotFound desc = could not find container \"f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5\": container with ID starting with f3044607aa2bdf3130820b61162615b3ae1fafb3a48f173d4fd09b683238c5a5 not found: ID does not exist" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.398736 4747 scope.go:117] "RemoveContainer" containerID="ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7" Nov 28 13:31:48 crc kubenswrapper[4747]: E1128 13:31:48.399065 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7\": container with ID starting with ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7 not found: ID does not exist" containerID="ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.399091 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7"} err="failed to get container status \"ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7\": rpc error: code = NotFound desc = could not find container \"ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7\": container with ID starting with ebff3bb8d65c8a9326925ffd1ad423261df130b3d88fa0078d60e2e388478df7 not found: ID does not exist" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.399107 4747 scope.go:117] "RemoveContainer" containerID="e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631" Nov 28 13:31:48 crc kubenswrapper[4747]: E1128 13:31:48.399346 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631\": container with ID starting with e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631 not found: ID does not exist" containerID="e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.399374 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631"} err="failed to get container status \"e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631\": rpc error: code = NotFound desc = could not find container \"e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631\": 
container with ID starting with e0efa8ed3bfd1f5a21bd8b0ff3d4c289fd63908b8091c8403c4d8601e5f15631 not found: ID does not exist" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.399391 4747 scope.go:117] "RemoveContainer" containerID="ac7815a74fe47c8f9aa22920131c6733c5e8d6e71cf9eb9ebc2aac1920209e1f" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.431897 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa74d52-c7a3-48da-ab38-4930c6eb135b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.675033 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b64z4"] Nov 28 13:31:48 crc kubenswrapper[4747]: I1128 13:31:48.680878 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b64z4"] Nov 28 13:31:49 crc kubenswrapper[4747]: I1128 13:31:49.359152 4747 generic.go:334] "Generic (PLEG): container finished" podID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerID="25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01" exitCode=0 Nov 28 13:31:49 crc kubenswrapper[4747]: I1128 13:31:49.359232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qldx4" event={"ID":"2e941a58-d4e6-4bb8-b46a-b0f288f656bd","Type":"ContainerDied","Data":"25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01"} Nov 28 13:31:49 crc kubenswrapper[4747]: I1128 13:31:49.653813 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" path="/var/lib/kubelet/pods/9aa74d52-c7a3-48da-ab38-4930c6eb135b/volumes" Nov 28 13:31:51 crc kubenswrapper[4747]: I1128 13:31:51.379977 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qldx4" 
event={"ID":"2e941a58-d4e6-4bb8-b46a-b0f288f656bd","Type":"ContainerStarted","Data":"eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392"} Nov 28 13:31:51 crc kubenswrapper[4747]: I1128 13:31:51.402900 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qldx4" podStartSLOduration=3.408887818 podStartE2EDuration="6.402878779s" podCreationTimestamp="2025-11-28 13:31:45 +0000 UTC" firstStartedPulling="2025-11-28 13:31:47.326262877 +0000 UTC m=+759.988744617" lastFinishedPulling="2025-11-28 13:31:50.320253848 +0000 UTC m=+762.982735578" observedRunningTime="2025-11-28 13:31:51.401985389 +0000 UTC m=+764.064467119" watchObservedRunningTime="2025-11-28 13:31:51.402878779 +0000 UTC m=+764.065360519" Nov 28 13:31:55 crc kubenswrapper[4747]: I1128 13:31:55.699436 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:55 crc kubenswrapper[4747]: I1128 13:31:55.700386 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:55 crc kubenswrapper[4747]: I1128 13:31:55.753008 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:56 crc kubenswrapper[4747]: I1128 13:31:56.449326 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:31:58 crc kubenswrapper[4747]: I1128 13:31:58.102970 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qldx4"] Nov 28 13:31:59 crc kubenswrapper[4747]: I1128 13:31:59.428913 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qldx4" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="registry-server" 
containerID="cri-o://eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392" gracePeriod=2 Nov 28 13:31:59 crc kubenswrapper[4747]: I1128 13:31:59.704366 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-58f86dcff-55hdw" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.349229 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.387468 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-catalog-content\") pod \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.387595 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65l8\" (UniqueName: \"kubernetes.io/projected/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-kube-api-access-c65l8\") pod \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.388529 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-utilities\") pod \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\" (UID: \"2e941a58-d4e6-4bb8-b46a-b0f288f656bd\") " Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.389422 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-utilities" (OuterVolumeSpecName: "utilities") pod "2e941a58-d4e6-4bb8-b46a-b0f288f656bd" (UID: "2e941a58-d4e6-4bb8-b46a-b0f288f656bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.401361 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-kube-api-access-c65l8" (OuterVolumeSpecName: "kube-api-access-c65l8") pod "2e941a58-d4e6-4bb8-b46a-b0f288f656bd" (UID: "2e941a58-d4e6-4bb8-b46a-b0f288f656bd"). InnerVolumeSpecName "kube-api-access-c65l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.444455 4747 generic.go:334] "Generic (PLEG): container finished" podID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerID="eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392" exitCode=0 Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.444573 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qldx4" event={"ID":"2e941a58-d4e6-4bb8-b46a-b0f288f656bd","Type":"ContainerDied","Data":"eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392"} Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.444645 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qldx4" event={"ID":"2e941a58-d4e6-4bb8-b46a-b0f288f656bd","Type":"ContainerDied","Data":"34359af446d3b05c886f9eaa8f212f755e744b7ee300945aea1f64e6b00527ed"} Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.444669 4747 scope.go:117] "RemoveContainer" containerID="eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.444882 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qldx4" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.451532 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e941a58-d4e6-4bb8-b46a-b0f288f656bd" (UID: "2e941a58-d4e6-4bb8-b46a-b0f288f656bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.466465 4747 scope.go:117] "RemoveContainer" containerID="25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.489794 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65l8\" (UniqueName: \"kubernetes.io/projected/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-kube-api-access-c65l8\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.489830 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.489841 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e941a58-d4e6-4bb8-b46a-b0f288f656bd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.493737 4747 scope.go:117] "RemoveContainer" containerID="dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.510510 4747 scope.go:117] "RemoveContainer" containerID="eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392" Nov 28 13:32:00 crc kubenswrapper[4747]: E1128 13:32:00.510977 4747 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392\": container with ID starting with eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392 not found: ID does not exist" containerID="eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.511019 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392"} err="failed to get container status \"eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392\": rpc error: code = NotFound desc = could not find container \"eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392\": container with ID starting with eae145fa65d92bd42264eaac9a6edc1a9a3d67aa3ac0bc2db51940f21be31392 not found: ID does not exist" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.511048 4747 scope.go:117] "RemoveContainer" containerID="25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01" Nov 28 13:32:00 crc kubenswrapper[4747]: E1128 13:32:00.511567 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01\": container with ID starting with 25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01 not found: ID does not exist" containerID="25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.511645 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01"} err="failed to get container status \"25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01\": rpc error: code = NotFound desc = could not find container 
\"25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01\": container with ID starting with 25ec15b6a0f3b13edee9f1efdaf52a093a9860d2048245177c62e954b6f49b01 not found: ID does not exist" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.511687 4747 scope.go:117] "RemoveContainer" containerID="dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d" Nov 28 13:32:00 crc kubenswrapper[4747]: E1128 13:32:00.512082 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d\": container with ID starting with dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d not found: ID does not exist" containerID="dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.512131 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d"} err="failed to get container status \"dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d\": rpc error: code = NotFound desc = could not find container \"dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d\": container with ID starting with dc396f780a16606d4e3b7112dc97ec035a8506fbdb3b90e9fc4124921323103d not found: ID does not exist" Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.787815 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qldx4"] Nov 28 13:32:00 crc kubenswrapper[4747]: I1128 13:32:00.794790 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qldx4"] Nov 28 13:32:01 crc kubenswrapper[4747]: I1128 13:32:01.656898 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" 
path="/var/lib/kubelet/pods/2e941a58-d4e6-4bb8-b46a-b0f288f656bd/volumes" Nov 28 13:32:19 crc kubenswrapper[4747]: I1128 13:32:19.221469 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7456584b94-v95js" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.019258 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv"] Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.019867 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="registry-server" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.019889 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="registry-server" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.019902 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="extract-content" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.019909 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="extract-content" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.019920 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="registry-server" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.019927 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="registry-server" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.019935 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="extract-content" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.019942 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="extract-content" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.019956 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="extract-utilities" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.019963 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="extract-utilities" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.019975 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="extract-utilities" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.019982 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="extract-utilities" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.020093 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e941a58-d4e6-4bb8-b46a-b0f288f656bd" containerName="registry-server" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.020113 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa74d52-c7a3-48da-ab38-4930c6eb135b" containerName="registry-server" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.020580 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.022586 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4ltzp" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.022850 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.033152 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv"] Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.036444 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xvnqh"] Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.041564 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.047815 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.048104 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.080818 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9wkv\" (UniqueName: \"kubernetes.io/projected/2d74a4af-60f4-4a8e-9778-3be8fe163205-kube-api-access-q9wkv\") pod \"frr-k8s-webhook-server-7fcb986d4-r7gtv\" (UID: \"2d74a4af-60f4-4a8e-9778-3be8fe163205\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.080872 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics-certs\") 
pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.080906 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97srt\" (UniqueName: \"kubernetes.io/projected/1ad103ad-636e-440b-924d-7b59aa875aa4-kube-api-access-97srt\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.080934 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-startup\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.080993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-reloader\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.081006 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-conf\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.081044 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 
13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.081117 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-sockets\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.081168 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d74a4af-60f4-4a8e-9778-3be8fe163205-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r7gtv\" (UID: \"2d74a4af-60f4-4a8e-9778-3be8fe163205\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.116741 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mt84d"] Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.117878 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.120649 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.120723 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.120740 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dd2ql" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.121395 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.134361 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-pqb5v"] Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.135284 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.138570 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.151133 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-pqb5v"] Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.182829 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-sockets\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.182912 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d74a4af-60f4-4a8e-9778-3be8fe163205-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r7gtv\" (UID: \"2d74a4af-60f4-4a8e-9778-3be8fe163205\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.182951 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rth6\" (UniqueName: \"kubernetes.io/projected/108c08d8-4320-4227-ae03-933609bda4c0-kube-api-access-9rth6\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.182985 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9wkv\" (UniqueName: \"kubernetes.io/projected/2d74a4af-60f4-4a8e-9778-3be8fe163205-kube-api-access-q9wkv\") pod \"frr-k8s-webhook-server-7fcb986d4-r7gtv\" (UID: \"2d74a4af-60f4-4a8e-9778-3be8fe163205\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 
13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183009 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics-certs\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183049 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97srt\" (UniqueName: \"kubernetes.io/projected/1ad103ad-636e-440b-924d-7b59aa875aa4-kube-api-access-97srt\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-startup\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183110 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-conf\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183128 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-reloader\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183165 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-cert\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183186 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183228 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183254 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-metrics-certs\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183272 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-metrics-certs\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183300 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/108c08d8-4320-4227-ae03-933609bda4c0-metallb-excludel2\") pod 
\"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183329 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27nqb\" (UniqueName: \"kubernetes.io/projected/534794d1-ed1e-4a3e-a094-ae6acb566bdc-kube-api-access-27nqb\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.183922 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-sockets\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.183934 4747 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.184004 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics-certs podName:1ad103ad-636e-440b-924d-7b59aa875aa4 nodeName:}" failed. No retries permitted until 2025-11-28 13:32:20.683984993 +0000 UTC m=+793.346466723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics-certs") pod "frr-k8s-xvnqh" (UID: "1ad103ad-636e-440b-924d-7b59aa875aa4") : secret "frr-k8s-certs-secret" not found Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.184141 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-conf\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.184242 4747 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.184283 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d74a4af-60f4-4a8e-9778-3be8fe163205-cert podName:2d74a4af-60f4-4a8e-9778-3be8fe163205 nodeName:}" failed. No retries permitted until 2025-11-28 13:32:20.6842728 +0000 UTC m=+793.346754530 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d74a4af-60f4-4a8e-9778-3be8fe163205-cert") pod "frr-k8s-webhook-server-7fcb986d4-r7gtv" (UID: "2d74a4af-60f4-4a8e-9778-3be8fe163205") : secret "frr-k8s-webhook-server-cert" not found Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.184382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.184498 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1ad103ad-636e-440b-924d-7b59aa875aa4-reloader\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.185278 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1ad103ad-636e-440b-924d-7b59aa875aa4-frr-startup\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.228054 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9wkv\" (UniqueName: \"kubernetes.io/projected/2d74a4af-60f4-4a8e-9778-3be8fe163205-kube-api-access-q9wkv\") pod \"frr-k8s-webhook-server-7fcb986d4-r7gtv\" (UID: \"2d74a4af-60f4-4a8e-9778-3be8fe163205\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.239815 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97srt\" (UniqueName: \"kubernetes.io/projected/1ad103ad-636e-440b-924d-7b59aa875aa4-kube-api-access-97srt\") pod 
\"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.284782 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-cert\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.284825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.284849 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-metrics-certs\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.284864 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-metrics-certs\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.284883 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/108c08d8-4320-4227-ae03-933609bda4c0-metallb-excludel2\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.284905 
4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27nqb\" (UniqueName: \"kubernetes.io/projected/534794d1-ed1e-4a3e-a094-ae6acb566bdc-kube-api-access-27nqb\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.284945 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rth6\" (UniqueName: \"kubernetes.io/projected/108c08d8-4320-4227-ae03-933609bda4c0-kube-api-access-9rth6\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.284971 4747 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.285028 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-metrics-certs podName:534794d1-ed1e-4a3e-a094-ae6acb566bdc nodeName:}" failed. No retries permitted until 2025-11-28 13:32:20.785009696 +0000 UTC m=+793.447491426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-metrics-certs") pod "controller-f8648f98b-pqb5v" (UID: "534794d1-ed1e-4a3e-a094-ae6acb566bdc") : secret "controller-certs-secret" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.285069 4747 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.285157 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist podName:108c08d8-4320-4227-ae03-933609bda4c0 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:32:20.785133618 +0000 UTC m=+793.447615348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist") pod "speaker-mt84d" (UID: "108c08d8-4320-4227-ae03-933609bda4c0") : secret "metallb-memberlist" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.285172 4747 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.285220 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-metrics-certs podName:108c08d8-4320-4227-ae03-933609bda4c0 nodeName:}" failed. No retries permitted until 2025-11-28 13:32:20.78519471 +0000 UTC m=+793.447676570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-metrics-certs") pod "speaker-mt84d" (UID: "108c08d8-4320-4227-ae03-933609bda4c0") : secret "speaker-certs-secret" not found Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.285717 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/108c08d8-4320-4227-ae03-933609bda4c0-metallb-excludel2\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.287660 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.304949 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-cert\") pod \"controller-f8648f98b-pqb5v\" (UID: 
\"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.309854 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27nqb\" (UniqueName: \"kubernetes.io/projected/534794d1-ed1e-4a3e-a094-ae6acb566bdc-kube-api-access-27nqb\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.314742 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rth6\" (UniqueName: \"kubernetes.io/projected/108c08d8-4320-4227-ae03-933609bda4c0-kube-api-access-9rth6\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.690513 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d74a4af-60f4-4a8e-9778-3be8fe163205-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r7gtv\" (UID: \"2d74a4af-60f4-4a8e-9778-3be8fe163205\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.690589 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics-certs\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.696280 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ad103ad-636e-440b-924d-7b59aa875aa4-metrics-certs\") pod \"frr-k8s-xvnqh\" (UID: \"1ad103ad-636e-440b-924d-7b59aa875aa4\") " pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:20 crc kubenswrapper[4747]: 
I1128 13:32:20.700353 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d74a4af-60f4-4a8e-9778-3be8fe163205-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r7gtv\" (UID: \"2d74a4af-60f4-4a8e-9778-3be8fe163205\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.792672 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.792837 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-metrics-certs\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.792898 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-metrics-certs\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.792948 4747 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 28 13:32:20 crc kubenswrapper[4747]: E1128 13:32:20.793076 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist podName:108c08d8-4320-4227-ae03-933609bda4c0 nodeName:}" failed. No retries permitted until 2025-11-28 13:32:21.793047993 +0000 UTC m=+794.455529763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist") pod "speaker-mt84d" (UID: "108c08d8-4320-4227-ae03-933609bda4c0") : secret "metallb-memberlist" not found Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.798254 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-metrics-certs\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.799006 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/534794d1-ed1e-4a3e-a094-ae6acb566bdc-metrics-certs\") pod \"controller-f8648f98b-pqb5v\" (UID: \"534794d1-ed1e-4a3e-a094-ae6acb566bdc\") " pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.942547 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:20 crc kubenswrapper[4747]: I1128 13:32:20.960721 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.059901 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.177665 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv"] Nov 28 13:32:21 crc kubenswrapper[4747]: W1128 13:32:21.188070 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d74a4af_60f4_4a8e_9778_3be8fe163205.slice/crio-cecde3d38e23ec24561abb06e8c88fbe0c956b2717024e47aa5d2d340bf92e19 WatchSource:0}: Error finding container cecde3d38e23ec24561abb06e8c88fbe0c956b2717024e47aa5d2d340bf92e19: Status 404 returned error can't find the container with id cecde3d38e23ec24561abb06e8c88fbe0c956b2717024e47aa5d2d340bf92e19 Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.267043 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-pqb5v"] Nov 28 13:32:21 crc kubenswrapper[4747]: W1128 13:32:21.275560 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod534794d1_ed1e_4a3e_a094_ae6acb566bdc.slice/crio-5ce8f2800edbe20213c25efd408264a776f625f2035fde11ea888503423c796c WatchSource:0}: Error finding container 5ce8f2800edbe20213c25efd408264a776f625f2035fde11ea888503423c796c: Status 404 returned error can't find the container with id 5ce8f2800edbe20213c25efd408264a776f625f2035fde11ea888503423c796c Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.583291 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-pqb5v" event={"ID":"534794d1-ed1e-4a3e-a094-ae6acb566bdc","Type":"ContainerStarted","Data":"f471aef26827f311ecb6c05820c4acf1a921a0819914549aeaedc391609c8db4"} Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.583691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-pqb5v" 
event={"ID":"534794d1-ed1e-4a3e-a094-ae6acb566bdc","Type":"ContainerStarted","Data":"5ce8f2800edbe20213c25efd408264a776f625f2035fde11ea888503423c796c"} Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.585005 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" event={"ID":"2d74a4af-60f4-4a8e-9778-3be8fe163205","Type":"ContainerStarted","Data":"cecde3d38e23ec24561abb06e8c88fbe0c956b2717024e47aa5d2d340bf92e19"} Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.587132 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerStarted","Data":"267af18993022cdf5f9c6e81dcc0aef8fb504e2903073aeba30c80f728ee0c35"} Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.807256 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.814342 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/108c08d8-4320-4227-ae03-933609bda4c0-memberlist\") pod \"speaker-mt84d\" (UID: \"108c08d8-4320-4227-ae03-933609bda4c0\") " pod="metallb-system/speaker-mt84d" Nov 28 13:32:21 crc kubenswrapper[4747]: I1128 13:32:21.939738 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mt84d" Nov 28 13:32:21 crc kubenswrapper[4747]: W1128 13:32:21.964651 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod108c08d8_4320_4227_ae03_933609bda4c0.slice/crio-49edd0ba881b52bd62b2265b78d8130e6df5f8b0676352b8f42564a8ecd585e5 WatchSource:0}: Error finding container 49edd0ba881b52bd62b2265b78d8130e6df5f8b0676352b8f42564a8ecd585e5: Status 404 returned error can't find the container with id 49edd0ba881b52bd62b2265b78d8130e6df5f8b0676352b8f42564a8ecd585e5 Nov 28 13:32:22 crc kubenswrapper[4747]: I1128 13:32:22.606177 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mt84d" event={"ID":"108c08d8-4320-4227-ae03-933609bda4c0","Type":"ContainerStarted","Data":"5f996566994eb4c5db5edf5d90dc22016eb3dd4747cc69dd3063ab216239ac61"} Nov 28 13:32:22 crc kubenswrapper[4747]: I1128 13:32:22.606483 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mt84d" event={"ID":"108c08d8-4320-4227-ae03-933609bda4c0","Type":"ContainerStarted","Data":"49edd0ba881b52bd62b2265b78d8130e6df5f8b0676352b8f42564a8ecd585e5"} Nov 28 13:32:25 crc kubenswrapper[4747]: I1128 13:32:25.649441 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mt84d" Nov 28 13:32:25 crc kubenswrapper[4747]: I1128 13:32:25.650002 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:25 crc kubenswrapper[4747]: I1128 13:32:25.650015 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mt84d" event={"ID":"108c08d8-4320-4227-ae03-933609bda4c0","Type":"ContainerStarted","Data":"feae3fded0d885054c1c694969982f9bffae4004e04715f52d9b4f77d7227658"} Nov 28 13:32:25 crc kubenswrapper[4747]: I1128 13:32:25.650033 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-f8648f98b-pqb5v" event={"ID":"534794d1-ed1e-4a3e-a094-ae6acb566bdc","Type":"ContainerStarted","Data":"c6d780dfb08435d04d47a6b2c14cd1ee9c261da67ffd2e6dc960dee9f57f13f8"} Nov 28 13:32:25 crc kubenswrapper[4747]: I1128 13:32:25.683687 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mt84d" podStartSLOduration=2.997240265 podStartE2EDuration="5.683664356s" podCreationTimestamp="2025-11-28 13:32:20 +0000 UTC" firstStartedPulling="2025-11-28 13:32:22.171275942 +0000 UTC m=+794.833757672" lastFinishedPulling="2025-11-28 13:32:24.857700033 +0000 UTC m=+797.520181763" observedRunningTime="2025-11-28 13:32:25.674513118 +0000 UTC m=+798.336994858" watchObservedRunningTime="2025-11-28 13:32:25.683664356 +0000 UTC m=+798.346146096" Nov 28 13:32:25 crc kubenswrapper[4747]: I1128 13:32:25.695654 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-pqb5v" podStartSLOduration=2.254429447 podStartE2EDuration="5.695629369s" podCreationTimestamp="2025-11-28 13:32:20 +0000 UTC" firstStartedPulling="2025-11-28 13:32:21.411236231 +0000 UTC m=+794.073717961" lastFinishedPulling="2025-11-28 13:32:24.852436153 +0000 UTC m=+797.514917883" observedRunningTime="2025-11-28 13:32:25.691240829 +0000 UTC m=+798.353722569" watchObservedRunningTime="2025-11-28 13:32:25.695629369 +0000 UTC m=+798.358111109" Nov 28 13:32:28 crc kubenswrapper[4747]: I1128 13:32:28.667681 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ad103ad-636e-440b-924d-7b59aa875aa4" containerID="1e91797549a1a76cfae1aa6df0cd364429dd6bc6f08a0388d4a61f12d7986744" exitCode=0 Nov 28 13:32:28 crc kubenswrapper[4747]: I1128 13:32:28.667745 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerDied","Data":"1e91797549a1a76cfae1aa6df0cd364429dd6bc6f08a0388d4a61f12d7986744"} Nov 
28 13:32:28 crc kubenswrapper[4747]: I1128 13:32:28.669464 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" event={"ID":"2d74a4af-60f4-4a8e-9778-3be8fe163205","Type":"ContainerStarted","Data":"d9da78004080fe77fa8467de91ca3036d65da4508830841f7da071e18fa21cc0"} Nov 28 13:32:28 crc kubenswrapper[4747]: I1128 13:32:28.670084 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:28 crc kubenswrapper[4747]: I1128 13:32:28.720042 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" podStartSLOduration=1.5063198880000002 podStartE2EDuration="8.720021742s" podCreationTimestamp="2025-11-28 13:32:20 +0000 UTC" firstStartedPulling="2025-11-28 13:32:21.191702698 +0000 UTC m=+793.854184428" lastFinishedPulling="2025-11-28 13:32:28.405404512 +0000 UTC m=+801.067886282" observedRunningTime="2025-11-28 13:32:28.718743182 +0000 UTC m=+801.381224932" watchObservedRunningTime="2025-11-28 13:32:28.720021742 +0000 UTC m=+801.382503472" Nov 28 13:32:29 crc kubenswrapper[4747]: I1128 13:32:29.683274 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ad103ad-636e-440b-924d-7b59aa875aa4" containerID="95fa2f53530ed4d01e8ebcf6eb67f45051aa8fe21d5aa0fdbe49b3cb486242d7" exitCode=0 Nov 28 13:32:29 crc kubenswrapper[4747]: I1128 13:32:29.684425 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerDied","Data":"95fa2f53530ed4d01e8ebcf6eb67f45051aa8fe21d5aa0fdbe49b3cb486242d7"} Nov 28 13:32:30 crc kubenswrapper[4747]: I1128 13:32:30.693499 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ad103ad-636e-440b-924d-7b59aa875aa4" containerID="fe3f7e7fffbcbc4b7a814d5ab8652cccf339a706b4e65f9bec52d60a9d0865cb" exitCode=0 Nov 28 13:32:30 crc 
kubenswrapper[4747]: I1128 13:32:30.693589 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerDied","Data":"fe3f7e7fffbcbc4b7a814d5ab8652cccf339a706b4e65f9bec52d60a9d0865cb"} Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.064230 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-pqb5v" Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.705201 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerStarted","Data":"0d90700fddf64f118d076db2a08a7097ecc229184639e55c48fabb74c26826a1"} Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.705740 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.705758 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerStarted","Data":"52228c4d215644c9cae69cb5c534be0b5edf00511a347a44ad3a742c48b42eb7"} Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.705774 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerStarted","Data":"5ec3b5d7cd82131e41082c0b10d94177bf8a63d1c1751f91b54ab3be96b12409"} Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.705790 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerStarted","Data":"aff1e15726e301d05c864f3dca8801548187ec1d66fb1996922bcd4833a7b0db"} Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.706849 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerStarted","Data":"8a9e748b04e7d1ce08720890cec791310fc0fca69587fdf08b6fc0efceb2d941"} Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.706868 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xvnqh" event={"ID":"1ad103ad-636e-440b-924d-7b59aa875aa4","Type":"ContainerStarted","Data":"01ebb6c60a719035878e2e4cbc3c6a9835b6a82f883a27a123650838ec361fc8"} Nov 28 13:32:31 crc kubenswrapper[4747]: I1128 13:32:31.735480 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xvnqh" podStartSLOduration=5.421172534 podStartE2EDuration="12.73545361s" podCreationTimestamp="2025-11-28 13:32:19 +0000 UTC" firstStartedPulling="2025-11-28 13:32:21.079633364 +0000 UTC m=+793.742115094" lastFinishedPulling="2025-11-28 13:32:28.39391441 +0000 UTC m=+801.056396170" observedRunningTime="2025-11-28 13:32:31.727768575 +0000 UTC m=+804.390250335" watchObservedRunningTime="2025-11-28 13:32:31.73545361 +0000 UTC m=+804.397935380" Nov 28 13:32:35 crc kubenswrapper[4747]: I1128 13:32:35.961277 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:36 crc kubenswrapper[4747]: I1128 13:32:36.033133 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:40 crc kubenswrapper[4747]: I1128 13:32:40.946197 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r7gtv" Nov 28 13:32:40 crc kubenswrapper[4747]: I1128 13:32:40.965425 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xvnqh" Nov 28 13:32:41 crc kubenswrapper[4747]: I1128 13:32:41.945327 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mt84d" Nov 28 
13:32:44 crc kubenswrapper[4747]: I1128 13:32:44.915639 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-wqwh4"] Nov 28 13:32:44 crc kubenswrapper[4747]: I1128 13:32:44.917342 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-wqwh4" Nov 28 13:32:44 crc kubenswrapper[4747]: I1128 13:32:44.922402 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-tft95" Nov 28 13:32:44 crc kubenswrapper[4747]: I1128 13:32:44.926505 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-wqwh4"] Nov 28 13:32:44 crc kubenswrapper[4747]: I1128 13:32:44.993936 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhdx\" (UniqueName: \"kubernetes.io/projected/cb52369d-85b1-459d-afa3-c60bdf2ad7cf-kube-api-access-xqhdx\") pod \"infra-operator-index-wqwh4\" (UID: \"cb52369d-85b1-459d-afa3-c60bdf2ad7cf\") " pod="openstack-operators/infra-operator-index-wqwh4" Nov 28 13:32:45 crc kubenswrapper[4747]: I1128 13:32:45.095097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqhdx\" (UniqueName: \"kubernetes.io/projected/cb52369d-85b1-459d-afa3-c60bdf2ad7cf-kube-api-access-xqhdx\") pod \"infra-operator-index-wqwh4\" (UID: \"cb52369d-85b1-459d-afa3-c60bdf2ad7cf\") " pod="openstack-operators/infra-operator-index-wqwh4" Nov 28 13:32:45 crc kubenswrapper[4747]: I1128 13:32:45.123071 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhdx\" (UniqueName: \"kubernetes.io/projected/cb52369d-85b1-459d-afa3-c60bdf2ad7cf-kube-api-access-xqhdx\") pod \"infra-operator-index-wqwh4\" (UID: \"cb52369d-85b1-459d-afa3-c60bdf2ad7cf\") " pod="openstack-operators/infra-operator-index-wqwh4" Nov 28 13:32:45 crc kubenswrapper[4747]: 
I1128 13:32:45.246314 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-wqwh4" Nov 28 13:32:45 crc kubenswrapper[4747]: I1128 13:32:45.680074 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-wqwh4"] Nov 28 13:32:45 crc kubenswrapper[4747]: I1128 13:32:45.803515 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wqwh4" event={"ID":"cb52369d-85b1-459d-afa3-c60bdf2ad7cf","Type":"ContainerStarted","Data":"64bdebd59c32b03506e145bdcc246e26264f145f75d3cb13c6aa866cec2f4f09"} Nov 28 13:32:47 crc kubenswrapper[4747]: I1128 13:32:47.822686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wqwh4" event={"ID":"cb52369d-85b1-459d-afa3-c60bdf2ad7cf","Type":"ContainerStarted","Data":"27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71"} Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 13:32:49.103322 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-wqwh4" podStartSLOduration=3.938242518 podStartE2EDuration="5.103287066s" podCreationTimestamp="2025-11-28 13:32:44 +0000 UTC" firstStartedPulling="2025-11-28 13:32:45.691805963 +0000 UTC m=+818.354287713" lastFinishedPulling="2025-11-28 13:32:46.856850511 +0000 UTC m=+819.519332261" observedRunningTime="2025-11-28 13:32:47.853973333 +0000 UTC m=+820.516455103" watchObservedRunningTime="2025-11-28 13:32:49.103287066 +0000 UTC m=+821.765768826" Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 13:32:49.111105 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-wqwh4"] Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 13:32:49.714095 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-psl9q"] Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 
13:32:49.714816 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 13:32:49.728445 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-psl9q"] Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 13:32:49.837645 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-wqwh4" podUID="cb52369d-85b1-459d-afa3-c60bdf2ad7cf" containerName="registry-server" containerID="cri-o://27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71" gracePeriod=2 Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 13:32:49.863807 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97xg\" (UniqueName: \"kubernetes.io/projected/0d062986-fe87-4371-8ead-8bdb1ebe83ac-kube-api-access-m97xg\") pod \"infra-operator-index-psl9q\" (UID: \"0d062986-fe87-4371-8ead-8bdb1ebe83ac\") " pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:32:49 crc kubenswrapper[4747]: I1128 13:32:49.965234 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97xg\" (UniqueName: \"kubernetes.io/projected/0d062986-fe87-4371-8ead-8bdb1ebe83ac-kube-api-access-m97xg\") pod \"infra-operator-index-psl9q\" (UID: \"0d062986-fe87-4371-8ead-8bdb1ebe83ac\") " pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.002231 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97xg\" (UniqueName: \"kubernetes.io/projected/0d062986-fe87-4371-8ead-8bdb1ebe83ac-kube-api-access-m97xg\") pod \"infra-operator-index-psl9q\" (UID: \"0d062986-fe87-4371-8ead-8bdb1ebe83ac\") " pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.042406 4747 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.223151 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-wqwh4" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.370197 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqhdx\" (UniqueName: \"kubernetes.io/projected/cb52369d-85b1-459d-afa3-c60bdf2ad7cf-kube-api-access-xqhdx\") pod \"cb52369d-85b1-459d-afa3-c60bdf2ad7cf\" (UID: \"cb52369d-85b1-459d-afa3-c60bdf2ad7cf\") " Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.373917 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb52369d-85b1-459d-afa3-c60bdf2ad7cf-kube-api-access-xqhdx" (OuterVolumeSpecName: "kube-api-access-xqhdx") pod "cb52369d-85b1-459d-afa3-c60bdf2ad7cf" (UID: "cb52369d-85b1-459d-afa3-c60bdf2ad7cf"). InnerVolumeSpecName "kube-api-access-xqhdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.446080 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-psl9q"] Nov 28 13:32:50 crc kubenswrapper[4747]: W1128 13:32:50.454368 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d062986_fe87_4371_8ead_8bdb1ebe83ac.slice/crio-801b7f0bee06238b8ce2bc489a8347a2d9085dd59852c811bc80d5f053f81350 WatchSource:0}: Error finding container 801b7f0bee06238b8ce2bc489a8347a2d9085dd59852c811bc80d5f053f81350: Status 404 returned error can't find the container with id 801b7f0bee06238b8ce2bc489a8347a2d9085dd59852c811bc80d5f053f81350 Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.471985 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqhdx\" (UniqueName: \"kubernetes.io/projected/cb52369d-85b1-459d-afa3-c60bdf2ad7cf-kube-api-access-xqhdx\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.847051 4747 generic.go:334] "Generic (PLEG): container finished" podID="cb52369d-85b1-459d-afa3-c60bdf2ad7cf" containerID="27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71" exitCode=0 Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.847135 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-wqwh4" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.847142 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wqwh4" event={"ID":"cb52369d-85b1-459d-afa3-c60bdf2ad7cf","Type":"ContainerDied","Data":"27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71"} Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.847303 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wqwh4" event={"ID":"cb52369d-85b1-459d-afa3-c60bdf2ad7cf","Type":"ContainerDied","Data":"64bdebd59c32b03506e145bdcc246e26264f145f75d3cb13c6aa866cec2f4f09"} Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.847336 4747 scope.go:117] "RemoveContainer" containerID="27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.848421 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-psl9q" event={"ID":"0d062986-fe87-4371-8ead-8bdb1ebe83ac","Type":"ContainerStarted","Data":"801b7f0bee06238b8ce2bc489a8347a2d9085dd59852c811bc80d5f053f81350"} Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.865408 4747 scope.go:117] "RemoveContainer" containerID="27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71" Nov 28 13:32:50 crc kubenswrapper[4747]: E1128 13:32:50.866133 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71\": container with ID starting with 27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71 not found: ID does not exist" containerID="27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.866237 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71"} err="failed to get container status \"27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71\": rpc error: code = NotFound desc = could not find container \"27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71\": container with ID starting with 27114eff2b36a7d804b122984538849ae5c006153e9e0b474f8267799fa2df71 not found: ID does not exist" Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.901339 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-wqwh4"] Nov 28 13:32:50 crc kubenswrapper[4747]: I1128 13:32:50.914018 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-wqwh4"] Nov 28 13:32:51 crc kubenswrapper[4747]: I1128 13:32:51.655870 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb52369d-85b1-459d-afa3-c60bdf2ad7cf" path="/var/lib/kubelet/pods/cb52369d-85b1-459d-afa3-c60bdf2ad7cf/volumes" Nov 28 13:32:51 crc kubenswrapper[4747]: I1128 13:32:51.858887 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-psl9q" event={"ID":"0d062986-fe87-4371-8ead-8bdb1ebe83ac","Type":"ContainerStarted","Data":"7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35"} Nov 28 13:32:51 crc kubenswrapper[4747]: I1128 13:32:51.883007 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-psl9q" podStartSLOduration=2.411673725 podStartE2EDuration="2.882972841s" podCreationTimestamp="2025-11-28 13:32:49 +0000 UTC" firstStartedPulling="2025-11-28 13:32:50.461123065 +0000 UTC m=+823.123604795" lastFinishedPulling="2025-11-28 13:32:50.932422181 +0000 UTC m=+823.594903911" observedRunningTime="2025-11-28 13:32:51.87988754 +0000 UTC m=+824.542369340" watchObservedRunningTime="2025-11-28 13:32:51.882972841 +0000 UTC 
m=+824.545454611" Nov 28 13:33:00 crc kubenswrapper[4747]: I1128 13:33:00.043292 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:33:00 crc kubenswrapper[4747]: I1128 13:33:00.045068 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:33:00 crc kubenswrapper[4747]: I1128 13:33:00.077535 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:33:00 crc kubenswrapper[4747]: I1128 13:33:00.966677 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:33:02 crc kubenswrapper[4747]: I1128 13:33:02.955809 4747 generic.go:334] "Generic (PLEG): container finished" podID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerID="67f8ee89b90ec1bc94d6660de3a0d7513231c0e92f0296d258674e7e2b8eb714" exitCode=1 Nov 28 13:33:02 crc kubenswrapper[4747]: I1128 13:33:02.955909 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" event={"ID":"178b0a8c-9539-48dd-b483-09228bc22b6d","Type":"ContainerDied","Data":"67f8ee89b90ec1bc94d6660de3a0d7513231c0e92f0296d258674e7e2b8eb714"} Nov 28 13:33:02 crc kubenswrapper[4747]: I1128 13:33:02.957628 4747 scope.go:117] "RemoveContainer" containerID="67f8ee89b90ec1bc94d6660de3a0d7513231c0e92f0296d258674e7e2b8eb714" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.151992 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw"] Nov 28 13:33:03 crc kubenswrapper[4747]: E1128 13:33:03.152282 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb52369d-85b1-459d-afa3-c60bdf2ad7cf" containerName="registry-server" Nov 28 13:33:03 crc 
kubenswrapper[4747]: I1128 13:33:03.152295 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb52369d-85b1-459d-afa3-c60bdf2ad7cf" containerName="registry-server" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.152391 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb52369d-85b1-459d-afa3-c60bdf2ad7cf" containerName="registry-server" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.153109 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.162642 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw"] Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.162813 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-stc2j" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.256941 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hpks\" (UniqueName: \"kubernetes.io/projected/c6ae9e4a-ff17-4203-95f7-de7d9690f798-kube-api-access-6hpks\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.256981 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc 
kubenswrapper[4747]: I1128 13:33:03.257004 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.358551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.358779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hpks\" (UniqueName: \"kubernetes.io/projected/c6ae9e4a-ff17-4203-95f7-de7d9690f798-kube-api-access-6hpks\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.358824 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.359015 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.359458 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.398280 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hpks\" (UniqueName: \"kubernetes.io/projected/c6ae9e4a-ff17-4203-95f7-de7d9690f798-kube-api-access-6hpks\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.480367 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.759957 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw"] Nov 28 13:33:03 crc kubenswrapper[4747]: W1128 13:33:03.763028 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ae9e4a_ff17_4203_95f7_de7d9690f798.slice/crio-ac843b163acf18e6ded68059e6fbd3c7c90f44f65033333ad8781355c69e990a WatchSource:0}: Error finding container ac843b163acf18e6ded68059e6fbd3c7c90f44f65033333ad8781355c69e990a: Status 404 returned error can't find the container with id ac843b163acf18e6ded68059e6fbd3c7c90f44f65033333ad8781355c69e990a Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.965278 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" event={"ID":"178b0a8c-9539-48dd-b483-09228bc22b6d","Type":"ContainerStarted","Data":"783e4bfae6db909013827700759c2c336710875785103324248e08a5f6cc6fef"} Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.966260 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.967601 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" event={"ID":"c6ae9e4a-ff17-4203-95f7-de7d9690f798","Type":"ContainerStarted","Data":"d573ccf19009ccc301aa54cb94f8e0e88b14c27df227b9f97373b1d45c740572"} Nov 28 13:33:03 crc kubenswrapper[4747]: I1128 13:33:03.967659 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" 
event={"ID":"c6ae9e4a-ff17-4203-95f7-de7d9690f798","Type":"ContainerStarted","Data":"ac843b163acf18e6ded68059e6fbd3c7c90f44f65033333ad8781355c69e990a"} Nov 28 13:33:04 crc kubenswrapper[4747]: I1128 13:33:04.977841 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerID="d573ccf19009ccc301aa54cb94f8e0e88b14c27df227b9f97373b1d45c740572" exitCode=0 Nov 28 13:33:04 crc kubenswrapper[4747]: I1128 13:33:04.977932 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" event={"ID":"c6ae9e4a-ff17-4203-95f7-de7d9690f798","Type":"ContainerDied","Data":"d573ccf19009ccc301aa54cb94f8e0e88b14c27df227b9f97373b1d45c740572"} Nov 28 13:33:05 crc kubenswrapper[4747]: I1128 13:33:05.986166 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerID="85f8bfaa90ebdc70b3f219f16bc03f84b04c7b335222cdadc00a9780550f57c0" exitCode=0 Nov 28 13:33:05 crc kubenswrapper[4747]: I1128 13:33:05.986226 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" event={"ID":"c6ae9e4a-ff17-4203-95f7-de7d9690f798","Type":"ContainerDied","Data":"85f8bfaa90ebdc70b3f219f16bc03f84b04c7b335222cdadc00a9780550f57c0"} Nov 28 13:33:06 crc kubenswrapper[4747]: I1128 13:33:06.998871 4747 generic.go:334] "Generic (PLEG): container finished" podID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerID="523d2a6c9fec4e2b181e95a4324d357e6f29e48cc4dc06cc3a211cc59184d1fd" exitCode=0 Nov 28 13:33:06 crc kubenswrapper[4747]: I1128 13:33:06.998943 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" event={"ID":"c6ae9e4a-ff17-4203-95f7-de7d9690f798","Type":"ContainerDied","Data":"523d2a6c9fec4e2b181e95a4324d357e6f29e48cc4dc06cc3a211cc59184d1fd"} Nov 28 
13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.251453 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.338012 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-util\") pod \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.338489 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-bundle\") pod \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.338591 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hpks\" (UniqueName: \"kubernetes.io/projected/c6ae9e4a-ff17-4203-95f7-de7d9690f798-kube-api-access-6hpks\") pod \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\" (UID: \"c6ae9e4a-ff17-4203-95f7-de7d9690f798\") " Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.340101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-bundle" (OuterVolumeSpecName: "bundle") pod "c6ae9e4a-ff17-4203-95f7-de7d9690f798" (UID: "c6ae9e4a-ff17-4203-95f7-de7d9690f798"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.346851 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ae9e4a-ff17-4203-95f7-de7d9690f798-kube-api-access-6hpks" (OuterVolumeSpecName: "kube-api-access-6hpks") pod "c6ae9e4a-ff17-4203-95f7-de7d9690f798" (UID: "c6ae9e4a-ff17-4203-95f7-de7d9690f798"). InnerVolumeSpecName "kube-api-access-6hpks". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.358706 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-util" (OuterVolumeSpecName: "util") pod "c6ae9e4a-ff17-4203-95f7-de7d9690f798" (UID: "c6ae9e4a-ff17-4203-95f7-de7d9690f798"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.440556 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.440622 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ae9e4a-ff17-4203-95f7-de7d9690f798-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:08 crc kubenswrapper[4747]: I1128 13:33:08.440649 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hpks\" (UniqueName: \"kubernetes.io/projected/c6ae9e4a-ff17-4203-95f7-de7d9690f798-kube-api-access-6hpks\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:09 crc kubenswrapper[4747]: I1128 13:33:09.016957 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" 
event={"ID":"c6ae9e4a-ff17-4203-95f7-de7d9690f798","Type":"ContainerDied","Data":"ac843b163acf18e6ded68059e6fbd3c7c90f44f65033333ad8781355c69e990a"} Nov 28 13:33:09 crc kubenswrapper[4747]: I1128 13:33:09.017003 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac843b163acf18e6ded68059e6fbd3c7c90f44f65033333ad8781355c69e990a" Nov 28 13:33:09 crc kubenswrapper[4747]: I1128 13:33:09.017097 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw" Nov 28 13:33:09 crc kubenswrapper[4747]: E1128 13:33:09.221235 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ae9e4a_ff17_4203_95f7_de7d9690f798.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ae9e4a_ff17_4203_95f7_de7d9690f798.slice/crio-ac843b163acf18e6ded68059e6fbd3c7c90f44f65033333ad8781355c69e990a\": RecentStats: unable to find data in memory cache]" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.714843 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xshx7"] Nov 28 13:33:11 crc kubenswrapper[4747]: E1128 13:33:11.715498 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerName="util" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.715519 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerName="util" Nov 28 13:33:11 crc kubenswrapper[4747]: E1128 13:33:11.715536 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerName="pull" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.715549 4747 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerName="pull" Nov 28 13:33:11 crc kubenswrapper[4747]: E1128 13:33:11.715578 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerName="extract" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.715592 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerName="extract" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.715777 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" containerName="extract" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.718822 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.743280 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xshx7"] Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.792061 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/47bedd84-cccc-4cab-92c5-fa085b0871e1-kube-api-access-5rvl9\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.792170 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-utilities\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.792264 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-catalog-content\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.893138 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/47bedd84-cccc-4cab-92c5-fa085b0871e1-kube-api-access-5rvl9\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.893416 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-utilities\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.893515 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-catalog-content\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.894026 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-utilities\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.894053 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-catalog-content\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:11 crc kubenswrapper[4747]: I1128 13:33:11.913703 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/47bedd84-cccc-4cab-92c5-fa085b0871e1-kube-api-access-5rvl9\") pod \"redhat-marketplace-xshx7\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:12 crc kubenswrapper[4747]: I1128 13:33:12.055280 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:12 crc kubenswrapper[4747]: I1128 13:33:12.554850 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xshx7"] Nov 28 13:33:12 crc kubenswrapper[4747]: W1128 13:33:12.564249 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47bedd84_cccc_4cab_92c5_fa085b0871e1.slice/crio-11e30d56ed23b77e8801afc84198ef876ed9472dc055de64a339d4bdda03fa31 WatchSource:0}: Error finding container 11e30d56ed23b77e8801afc84198ef876ed9472dc055de64a339d4bdda03fa31: Status 404 returned error can't find the container with id 11e30d56ed23b77e8801afc84198ef876ed9472dc055de64a339d4bdda03fa31 Nov 28 13:33:13 crc kubenswrapper[4747]: I1128 13:33:13.046988 4747 generic.go:334] "Generic (PLEG): container finished" podID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerID="735d99fa61171ccf5041dd8da3d2be19d1594e2a5d1ffb5086783c9eabcf9f83" exitCode=0 Nov 28 13:33:13 crc kubenswrapper[4747]: I1128 13:33:13.047101 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xshx7" event={"ID":"47bedd84-cccc-4cab-92c5-fa085b0871e1","Type":"ContainerDied","Data":"735d99fa61171ccf5041dd8da3d2be19d1594e2a5d1ffb5086783c9eabcf9f83"} Nov 28 13:33:13 crc kubenswrapper[4747]: I1128 13:33:13.047438 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xshx7" event={"ID":"47bedd84-cccc-4cab-92c5-fa085b0871e1","Type":"ContainerStarted","Data":"11e30d56ed23b77e8801afc84198ef876ed9472dc055de64a339d4bdda03fa31"} Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.063897 4747 generic.go:334] "Generic (PLEG): container finished" podID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerID="caeea91abed76b80bc531eeeb0d9361d27f378d3670c3e0c3ede8cf6aba3f370" exitCode=0 Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.063957 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xshx7" event={"ID":"47bedd84-cccc-4cab-92c5-fa085b0871e1","Type":"ContainerDied","Data":"caeea91abed76b80bc531eeeb0d9361d27f378d3670c3e0c3ede8cf6aba3f370"} Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.761160 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.770658 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj"] Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.771770 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.773296 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.773434 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7qq6b" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.847312 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj"] Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.862501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmc25\" (UniqueName: \"kubernetes.io/projected/37483c76-950c-49c6-a4f3-aba8c5c8c41a-kube-api-access-fmc25\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.862804 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-webhook-cert\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.862845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-apiservice-cert\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: 
\"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.963908 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmc25\" (UniqueName: \"kubernetes.io/projected/37483c76-950c-49c6-a4f3-aba8c5c8c41a-kube-api-access-fmc25\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.964387 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-webhook-cert\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.964417 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-apiservice-cert\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.971175 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-webhook-cert\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.972766 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-apiservice-cert\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:15 crc kubenswrapper[4747]: I1128 13:33:15.990035 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmc25\" (UniqueName: \"kubernetes.io/projected/37483c76-950c-49c6-a4f3-aba8c5c8c41a-kube-api-access-fmc25\") pod \"infra-operator-controller-manager-74f9c56665-z2jpj\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:16 crc kubenswrapper[4747]: I1128 13:33:16.073037 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xshx7" event={"ID":"47bedd84-cccc-4cab-92c5-fa085b0871e1","Type":"ContainerStarted","Data":"238f16b2cc8dc84eb1dd13f1892902d801e3a145e88aff255d410d5f6935ede1"} Nov 28 13:33:16 crc kubenswrapper[4747]: I1128 13:33:16.089123 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:16 crc kubenswrapper[4747]: I1128 13:33:16.099649 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xshx7" podStartSLOduration=2.6559778339999998 podStartE2EDuration="5.099625956s" podCreationTimestamp="2025-11-28 13:33:11 +0000 UTC" firstStartedPulling="2025-11-28 13:33:13.048443447 +0000 UTC m=+845.710925177" lastFinishedPulling="2025-11-28 13:33:15.492091569 +0000 UTC m=+848.154573299" observedRunningTime="2025-11-28 13:33:16.097415 +0000 UTC m=+848.759896720" watchObservedRunningTime="2025-11-28 13:33:16.099625956 +0000 UTC m=+848.762107696" Nov 28 13:33:16 crc kubenswrapper[4747]: I1128 13:33:16.504394 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj"] Nov 28 13:33:16 crc kubenswrapper[4747]: W1128 13:33:16.511727 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37483c76_950c_49c6_a4f3_aba8c5c8c41a.slice/crio-3dbd5af8be3f50c07df09fcdba35c35f1bbd1e1ab612487718aadfdea60095ac WatchSource:0}: Error finding container 3dbd5af8be3f50c07df09fcdba35c35f1bbd1e1ab612487718aadfdea60095ac: Status 404 returned error can't find the container with id 3dbd5af8be3f50c07df09fcdba35c35f1bbd1e1ab612487718aadfdea60095ac Nov 28 13:33:17 crc kubenswrapper[4747]: I1128 13:33:17.079174 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" event={"ID":"37483c76-950c-49c6-a4f3-aba8c5c8c41a","Type":"ContainerStarted","Data":"3dbd5af8be3f50c07df09fcdba35c35f1bbd1e1ab612487718aadfdea60095ac"} Nov 28 13:33:19 crc kubenswrapper[4747]: I1128 13:33:19.089486 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" event={"ID":"37483c76-950c-49c6-a4f3-aba8c5c8c41a","Type":"ContainerStarted","Data":"9fef260230f3c32bb0b4fad410ac9ef66f2982fdc90242d88575b04eabf519a4"} Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.055607 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.055882 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.100419 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.147909 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.148881 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.153518 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"galera-openstack-dockercfg-bs4r9" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.154405 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config-data" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.159673 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"kube-root-ca.crt" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.160198 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openshift-service-ca.crt" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.168407 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.181549 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-scripts" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.190217 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.208503 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.221438 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.222071 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.222378 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.227246 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.233886 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.294034 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-kolla-config\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.294084 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-default\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.294123 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.294174 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 
13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.294215 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftj24\" (UniqueName: \"kubernetes.io/projected/23bc6d14-d758-4423-9c06-37b5eeac59f6-kube-api-access-ftj24\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.294246 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395760 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-operator-scripts\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395820 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-kolla-config\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395842 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-default\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc 
kubenswrapper[4747]: I1128 13:33:22.395861 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-generated\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395889 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kolla-config\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395910 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-kolla-config\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395934 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395960 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg8m6\" (UniqueName: \"kubernetes.io/projected/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kube-api-access-gg8m6\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 
13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.395990 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-default\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396006 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396036 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-operator-scripts\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396061 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396088 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpmzz\" (UniqueName: \"kubernetes.io/projected/625975dd-71a7-40d7-b99b-7204545ab2d5-kube-api-access-rpmzz\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 
13:33:22.396108 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftj24\" (UniqueName: \"kubernetes.io/projected/23bc6d14-d758-4423-9c06-37b5eeac59f6-kube-api-access-ftj24\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396130 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-generated\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396153 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396187 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-default\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.396251 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.397020 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.397220 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") device mount path \"/mnt/openstack/pv10\"" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.397245 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-kolla-config\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.397551 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-default\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.397796 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.417843 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.432527 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftj24\" (UniqueName: \"kubernetes.io/projected/23bc6d14-d758-4423-9c06-37b5eeac59f6-kube-api-access-ftj24\") pod \"openstack-galera-0\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497367 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-operator-scripts\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497416 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-generated\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kolla-config\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497482 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-kolla-config\") pod 
\"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497509 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg8m6\" (UniqueName: \"kubernetes.io/projected/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kube-api-access-gg8m6\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497536 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-default\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497590 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-operator-scripts\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497624 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpmzz\" (UniqueName: \"kubernetes.io/projected/625975dd-71a7-40d7-b99b-7204545ab2d5-kube-api-access-rpmzz\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " 
pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497645 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-generated\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497672 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.497707 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-default\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.498438 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") device mount path \"/mnt/openstack/pv12\"" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.498519 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kolla-config\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 
13:33:22.498543 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-generated\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.498578 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-generated\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.498612 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-default\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.498713 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") device mount path \"/mnt/openstack/pv01\"" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.498891 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-default\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.499012 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-kolla-config\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.499225 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.499388 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-operator-scripts\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.499716 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-operator-scripts\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.513315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg8m6\" (UniqueName: \"kubernetes.io/projected/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kube-api-access-gg8m6\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.513404 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.514038 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.515862 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpmzz\" (UniqueName: \"kubernetes.io/projected/625975dd-71a7-40d7-b99b-7204545ab2d5-kube-api-access-rpmzz\") pod \"openstack-galera-2\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.542081 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.559836 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.861702 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.980712 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:33:22 crc kubenswrapper[4747]: I1128 13:33:22.984400 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:33:22 crc kubenswrapper[4747]: W1128 13:33:22.985311 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23bc6d14_d758_4423_9c06_37b5eeac59f6.slice/crio-bc9934fc0971080cdc7e36ce94cbb85a6097d3be858b1db4284c864843f16907 WatchSource:0}: Error finding container bc9934fc0971080cdc7e36ce94cbb85a6097d3be858b1db4284c864843f16907: Status 404 returned error can't find the container 
with id bc9934fc0971080cdc7e36ce94cbb85a6097d3be858b1db4284c864843f16907 Nov 28 13:33:22 crc kubenswrapper[4747]: W1128 13:33:22.985918 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode54ebc7f_c262_4900_9cd3_76fd6280c5f6.slice/crio-deab8159325ca167c0dc432a991a26bcf884d337eca00d234e32228a1864267a WatchSource:0}: Error finding container deab8159325ca167c0dc432a991a26bcf884d337eca00d234e32228a1864267a: Status 404 returned error can't find the container with id deab8159325ca167c0dc432a991a26bcf884d337eca00d234e32228a1864267a Nov 28 13:33:23 crc kubenswrapper[4747]: I1128 13:33:23.117961 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"625975dd-71a7-40d7-b99b-7204545ab2d5","Type":"ContainerStarted","Data":"1260e5c731f919f465450128a6b38ea5c3c7acdb87cb964711d72145558c854e"} Nov 28 13:33:23 crc kubenswrapper[4747]: I1128 13:33:23.119868 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" event={"ID":"37483c76-950c-49c6-a4f3-aba8c5c8c41a","Type":"ContainerStarted","Data":"cee35a56a70d10ab163196f295f134e068c85da00169c87824895f0ee1fa2bc7"} Nov 28 13:33:23 crc kubenswrapper[4747]: I1128 13:33:23.120030 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:23 crc kubenswrapper[4747]: I1128 13:33:23.121001 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"e54ebc7f-c262-4900-9cd3-76fd6280c5f6","Type":"ContainerStarted","Data":"deab8159325ca167c0dc432a991a26bcf884d337eca00d234e32228a1864267a"} Nov 28 13:33:23 crc kubenswrapper[4747]: I1128 13:33:23.125436 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" 
event={"ID":"23bc6d14-d758-4423-9c06-37b5eeac59f6","Type":"ContainerStarted","Data":"bc9934fc0971080cdc7e36ce94cbb85a6097d3be858b1db4284c864843f16907"} Nov 28 13:33:23 crc kubenswrapper[4747]: I1128 13:33:23.125661 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:33:23 crc kubenswrapper[4747]: I1128 13:33:23.139647 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" podStartSLOduration=2.635567159 podStartE2EDuration="8.139631054s" podCreationTimestamp="2025-11-28 13:33:15 +0000 UTC" firstStartedPulling="2025-11-28 13:33:16.515118197 +0000 UTC m=+849.177599927" lastFinishedPulling="2025-11-28 13:33:22.019182092 +0000 UTC m=+854.681663822" observedRunningTime="2025-11-28 13:33:23.13871295 +0000 UTC m=+855.801194700" watchObservedRunningTime="2025-11-28 13:33:23.139631054 +0000 UTC m=+855.802112784" Nov 28 13:33:25 crc kubenswrapper[4747]: I1128 13:33:25.699632 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xshx7"] Nov 28 13:33:25 crc kubenswrapper[4747]: I1128 13:33:25.700439 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xshx7" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="registry-server" containerID="cri-o://238f16b2cc8dc84eb1dd13f1892902d801e3a145e88aff255d410d5f6935ede1" gracePeriod=2 Nov 28 13:33:26 crc kubenswrapper[4747]: I1128 13:33:26.150610 4747 generic.go:334] "Generic (PLEG): container finished" podID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerID="238f16b2cc8dc84eb1dd13f1892902d801e3a145e88aff255d410d5f6935ede1" exitCode=0 Nov 28 13:33:26 crc kubenswrapper[4747]: I1128 13:33:26.150649 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xshx7" 
event={"ID":"47bedd84-cccc-4cab-92c5-fa085b0871e1","Type":"ContainerDied","Data":"238f16b2cc8dc84eb1dd13f1892902d801e3a145e88aff255d410d5f6935ede1"} Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.280514 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.281372 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.283637 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"memcached-memcached-dockercfg-mq8jd" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.283648 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"memcached-config-data" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.285569 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.383891 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq5w4\" (UniqueName: \"kubernetes.io/projected/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kube-api-access-vq5w4\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.383973 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kolla-config\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.384041 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-config-data\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.485576 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-config-data\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.485641 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq5w4\" (UniqueName: \"kubernetes.io/projected/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kube-api-access-vq5w4\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.485696 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kolla-config\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.486412 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kolla-config\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.486409 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-config-data\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 
28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.510353 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq5w4\" (UniqueName: \"kubernetes.io/projected/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kube-api-access-vq5w4\") pod \"memcached-0\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:27 crc kubenswrapper[4747]: I1128 13:33:27.598766 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.105005 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4vv2"] Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.111572 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4vv2"] Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.111685 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.114224 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-gqmm5" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.188084 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.195524 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xshx7" event={"ID":"47bedd84-cccc-4cab-92c5-fa085b0871e1","Type":"ContainerDied","Data":"11e30d56ed23b77e8801afc84198ef876ed9472dc055de64a339d4bdda03fa31"} Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.195564 4747 scope.go:117] "RemoveContainer" containerID="238f16b2cc8dc84eb1dd13f1892902d801e3a145e88aff255d410d5f6935ede1" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.195663 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xshx7" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.214657 4747 scope.go:117] "RemoveContainer" containerID="caeea91abed76b80bc531eeeb0d9361d27f378d3670c3e0c3ede8cf6aba3f370" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.248896 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldcf\" (UniqueName: \"kubernetes.io/projected/8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010-kube-api-access-qldcf\") pod \"rabbitmq-cluster-operator-index-j4vv2\" (UID: \"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010\") " pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.277111 4747 scope.go:117] "RemoveContainer" containerID="735d99fa61171ccf5041dd8da3d2be19d1594e2a5d1ffb5086783c9eabcf9f83" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.349933 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-utilities\") pod \"47bedd84-cccc-4cab-92c5-fa085b0871e1\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.350307 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/47bedd84-cccc-4cab-92c5-fa085b0871e1-kube-api-access-5rvl9\") pod \"47bedd84-cccc-4cab-92c5-fa085b0871e1\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.350399 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-catalog-content\") pod \"47bedd84-cccc-4cab-92c5-fa085b0871e1\" (UID: \"47bedd84-cccc-4cab-92c5-fa085b0871e1\") " Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.351167 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-utilities" (OuterVolumeSpecName: "utilities") pod "47bedd84-cccc-4cab-92c5-fa085b0871e1" (UID: "47bedd84-cccc-4cab-92c5-fa085b0871e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.351652 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldcf\" (UniqueName: \"kubernetes.io/projected/8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010-kube-api-access-qldcf\") pod \"rabbitmq-cluster-operator-index-j4vv2\" (UID: \"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010\") " pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.351741 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.357390 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47bedd84-cccc-4cab-92c5-fa085b0871e1-kube-api-access-5rvl9" (OuterVolumeSpecName: "kube-api-access-5rvl9") pod "47bedd84-cccc-4cab-92c5-fa085b0871e1" (UID: "47bedd84-cccc-4cab-92c5-fa085b0871e1"). InnerVolumeSpecName "kube-api-access-5rvl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.381977 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldcf\" (UniqueName: \"kubernetes.io/projected/8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010-kube-api-access-qldcf\") pod \"rabbitmq-cluster-operator-index-j4vv2\" (UID: \"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010\") " pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.382865 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47bedd84-cccc-4cab-92c5-fa085b0871e1" (UID: "47bedd84-cccc-4cab-92c5-fa085b0871e1"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.431093 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.453949 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bedd84-cccc-4cab-92c5-fa085b0871e1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.453977 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rvl9\" (UniqueName: \"kubernetes.io/projected/47bedd84-cccc-4cab-92c5-fa085b0871e1-kube-api-access-5rvl9\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.561152 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xshx7"] Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.579617 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xshx7"] Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.591981 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:33:31 crc kubenswrapper[4747]: W1128 13:33:31.602396 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2e6a2c7_dee9_40e2_a7fa_78038c271647.slice/crio-94299ed8f5cdecc1090f3a9d45eb976bd6228f5211203e6294ab4e5022add22f WatchSource:0}: Error finding container 94299ed8f5cdecc1090f3a9d45eb976bd6228f5211203e6294ab4e5022add22f: Status 404 returned error can't find the container with id 94299ed8f5cdecc1090f3a9d45eb976bd6228f5211203e6294ab4e5022add22f Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.654800 4747 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" path="/var/lib/kubelet/pods/47bedd84-cccc-4cab-92c5-fa085b0871e1/volumes" Nov 28 13:33:31 crc kubenswrapper[4747]: I1128 13:33:31.959919 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4vv2"] Nov 28 13:33:32 crc kubenswrapper[4747]: I1128 13:33:32.201013 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" event={"ID":"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010","Type":"ContainerStarted","Data":"62517172dfe71e9948d5888c3988e097f5d5ed60dc5f512b3a78a1e4b92f6a1b"} Nov 28 13:33:32 crc kubenswrapper[4747]: I1128 13:33:32.202268 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"23bc6d14-d758-4423-9c06-37b5eeac59f6","Type":"ContainerStarted","Data":"bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e"} Nov 28 13:33:32 crc kubenswrapper[4747]: I1128 13:33:32.204525 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"b2e6a2c7-dee9-40e2-a7fa-78038c271647","Type":"ContainerStarted","Data":"94299ed8f5cdecc1090f3a9d45eb976bd6228f5211203e6294ab4e5022add22f"} Nov 28 13:33:32 crc kubenswrapper[4747]: I1128 13:33:32.206040 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"625975dd-71a7-40d7-b99b-7204545ab2d5","Type":"ContainerStarted","Data":"3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9"} Nov 28 13:33:32 crc kubenswrapper[4747]: I1128 13:33:32.207474 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"e54ebc7f-c262-4900-9cd3-76fd6280c5f6","Type":"ContainerStarted","Data":"7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d"} Nov 28 13:33:36 crc kubenswrapper[4747]: I1128 13:33:36.242982 4747 
generic.go:334] "Generic (PLEG): container finished" podID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerID="7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d" exitCode=0 Nov 28 13:33:36 crc kubenswrapper[4747]: I1128 13:33:36.243062 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"e54ebc7f-c262-4900-9cd3-76fd6280c5f6","Type":"ContainerDied","Data":"7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d"} Nov 28 13:33:36 crc kubenswrapper[4747]: I1128 13:33:36.902108 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4vv2"] Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.250856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"b2e6a2c7-dee9-40e2-a7fa-78038c271647","Type":"ContainerStarted","Data":"519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d"} Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.250912 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.252844 4747 generic.go:334] "Generic (PLEG): container finished" podID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerID="3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9" exitCode=0 Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.252912 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"625975dd-71a7-40d7-b99b-7204545ab2d5","Type":"ContainerDied","Data":"3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9"} Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.254986 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" 
event={"ID":"e54ebc7f-c262-4900-9cd3-76fd6280c5f6","Type":"ContainerStarted","Data":"43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91"} Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.256809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" event={"ID":"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010","Type":"ContainerStarted","Data":"5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4"} Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.258513 4747 generic.go:334] "Generic (PLEG): container finished" podID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerID="bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e" exitCode=0 Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.258569 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"23bc6d14-d758-4423-9c06-37b5eeac59f6","Type":"ContainerDied","Data":"bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e"} Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.277780 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/memcached-0" podStartSLOduration=6.400428662 podStartE2EDuration="10.27775981s" podCreationTimestamp="2025-11-28 13:33:27 +0000 UTC" firstStartedPulling="2025-11-28 13:33:31.60875228 +0000 UTC m=+864.271234010" lastFinishedPulling="2025-11-28 13:33:35.486083428 +0000 UTC m=+868.148565158" observedRunningTime="2025-11-28 13:33:37.276459577 +0000 UTC m=+869.938941337" watchObservedRunningTime="2025-11-28 13:33:37.27775981 +0000 UTC m=+869.940241540" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.302738 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" podStartSLOduration=2.022191561 podStartE2EDuration="6.302716566s" podCreationTimestamp="2025-11-28 13:33:31 +0000 UTC" 
firstStartedPulling="2025-11-28 13:33:31.964509309 +0000 UTC m=+864.626991039" lastFinishedPulling="2025-11-28 13:33:36.245034274 +0000 UTC m=+868.907516044" observedRunningTime="2025-11-28 13:33:37.300068419 +0000 UTC m=+869.962550169" watchObservedRunningTime="2025-11-28 13:33:37.302716566 +0000 UTC m=+869.965198296" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.704941 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-1" podStartSLOduration=8.443151648 podStartE2EDuration="16.704922589s" podCreationTimestamp="2025-11-28 13:33:21 +0000 UTC" firstStartedPulling="2025-11-28 13:33:22.988045299 +0000 UTC m=+855.650527029" lastFinishedPulling="2025-11-28 13:33:31.24981623 +0000 UTC m=+863.912297970" observedRunningTime="2025-11-28 13:33:37.376289152 +0000 UTC m=+870.038770912" watchObservedRunningTime="2025-11-28 13:33:37.704922589 +0000 UTC m=+870.367404329" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.706974 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tdss5"] Nov 28 13:33:37 crc kubenswrapper[4747]: E1128 13:33:37.707258 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="extract-content" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.707279 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="extract-content" Nov 28 13:33:37 crc kubenswrapper[4747]: E1128 13:33:37.707298 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="extract-utilities" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.707308 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="extract-utilities" Nov 28 13:33:37 crc kubenswrapper[4747]: E1128 13:33:37.707329 4747 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="registry-server" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.707337 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="registry-server" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.707469 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="47bedd84-cccc-4cab-92c5-fa085b0871e1" containerName="registry-server" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.707936 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.716549 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tdss5"] Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.749650 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zl5d\" (UniqueName: \"kubernetes.io/projected/40bbf056-1cae-46c3-94a8-2f74f517cf31-kube-api-access-4zl5d\") pod \"rabbitmq-cluster-operator-index-tdss5\" (UID: \"40bbf056-1cae-46c3-94a8-2f74f517cf31\") " pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.851728 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zl5d\" (UniqueName: \"kubernetes.io/projected/40bbf056-1cae-46c3-94a8-2f74f517cf31-kube-api-access-4zl5d\") pod \"rabbitmq-cluster-operator-index-tdss5\" (UID: \"40bbf056-1cae-46c3-94a8-2f74f517cf31\") " pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:37 crc kubenswrapper[4747]: I1128 13:33:37.878023 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zl5d\" (UniqueName: 
\"kubernetes.io/projected/40bbf056-1cae-46c3-94a8-2f74f517cf31-kube-api-access-4zl5d\") pod \"rabbitmq-cluster-operator-index-tdss5\" (UID: \"40bbf056-1cae-46c3-94a8-2f74f517cf31\") " pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.022536 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.273967 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"23bc6d14-d758-4423-9c06-37b5eeac59f6","Type":"ContainerStarted","Data":"5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706"} Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.288970 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"625975dd-71a7-40d7-b99b-7204545ab2d5","Type":"ContainerStarted","Data":"8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59"} Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.289093 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" podUID="8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010" containerName="registry-server" containerID="cri-o://5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4" gracePeriod=2 Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.298692 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-0" podStartSLOduration=8.951464645 podStartE2EDuration="17.298674264s" podCreationTimestamp="2025-11-28 13:33:21 +0000 UTC" firstStartedPulling="2025-11-28 13:33:22.987517976 +0000 UTC m=+855.649999706" lastFinishedPulling="2025-11-28 13:33:31.334727595 +0000 UTC m=+863.997209325" observedRunningTime="2025-11-28 13:33:38.292189368 +0000 UTC 
m=+870.954671108" watchObservedRunningTime="2025-11-28 13:33:38.298674264 +0000 UTC m=+870.961155994" Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.326193 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-2" podStartSLOduration=8.946749565 podStartE2EDuration="17.326177605s" podCreationTimestamp="2025-11-28 13:33:21 +0000 UTC" firstStartedPulling="2025-11-28 13:33:22.873029788 +0000 UTC m=+855.535511518" lastFinishedPulling="2025-11-28 13:33:31.252457818 +0000 UTC m=+863.914939558" observedRunningTime="2025-11-28 13:33:38.320605543 +0000 UTC m=+870.983087283" watchObservedRunningTime="2025-11-28 13:33:38.326177605 +0000 UTC m=+870.988659335" Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.468549 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tdss5"] Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.759479 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.865845 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldcf\" (UniqueName: \"kubernetes.io/projected/8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010-kube-api-access-qldcf\") pod \"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010\" (UID: \"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010\") " Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.873409 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010-kube-api-access-qldcf" (OuterVolumeSpecName: "kube-api-access-qldcf") pod "8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010" (UID: "8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010"). InnerVolumeSpecName "kube-api-access-qldcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:33:38 crc kubenswrapper[4747]: I1128 13:33:38.967745 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldcf\" (UniqueName: \"kubernetes.io/projected/8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010-kube-api-access-qldcf\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.296444 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" event={"ID":"40bbf056-1cae-46c3-94a8-2f74f517cf31","Type":"ContainerStarted","Data":"2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317"} Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.296490 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" event={"ID":"40bbf056-1cae-46c3-94a8-2f74f517cf31","Type":"ContainerStarted","Data":"81a838177dded7b9ddf89028cd1ccab66320adf644eb48403170d0accb490739"} Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.299418 4747 generic.go:334] "Generic (PLEG): container finished" podID="8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010" containerID="5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4" exitCode=0 Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.299470 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" event={"ID":"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010","Type":"ContainerDied","Data":"5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4"} Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.299495 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" event={"ID":"8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010","Type":"ContainerDied","Data":"62517172dfe71e9948d5888c3988e097f5d5ed60dc5f512b3a78a1e4b92f6a1b"} Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.299512 4747 
scope.go:117] "RemoveContainer" containerID="5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4" Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.299538 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-j4vv2" Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.314450 4747 scope.go:117] "RemoveContainer" containerID="5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4" Nov 28 13:33:39 crc kubenswrapper[4747]: E1128 13:33:39.314762 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4\": container with ID starting with 5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4 not found: ID does not exist" containerID="5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4" Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.314799 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4"} err="failed to get container status \"5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4\": rpc error: code = NotFound desc = could not find container \"5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4\": container with ID starting with 5a679f83ea622117c6b8f19d671a2771ec9143bb983a668c042dac84fce373d4 not found: ID does not exist" Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.337064 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" podStartSLOduration=1.901793478 podStartE2EDuration="2.337044573s" podCreationTimestamp="2025-11-28 13:33:37 +0000 UTC" firstStartedPulling="2025-11-28 13:33:38.483231988 +0000 UTC m=+871.145713718" lastFinishedPulling="2025-11-28 
13:33:38.918483083 +0000 UTC m=+871.580964813" observedRunningTime="2025-11-28 13:33:39.32710713 +0000 UTC m=+871.989588880" watchObservedRunningTime="2025-11-28 13:33:39.337044573 +0000 UTC m=+871.999526313" Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.343564 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4vv2"] Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.347251 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-j4vv2"] Nov 28 13:33:39 crc kubenswrapper[4747]: I1128 13:33:39.655526 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010" path="/var/lib/kubelet/pods/8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010/volumes" Nov 28 13:33:42 crc kubenswrapper[4747]: I1128 13:33:42.499605 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:42 crc kubenswrapper[4747]: I1128 13:33:42.499951 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:42 crc kubenswrapper[4747]: I1128 13:33:42.542226 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:42 crc kubenswrapper[4747]: I1128 13:33:42.542279 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:42 crc kubenswrapper[4747]: I1128 13:33:42.561604 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:42 crc kubenswrapper[4747]: I1128 13:33:42.561668 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:42 crc kubenswrapper[4747]: I1128 13:33:42.600753 4747 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:33:44 crc kubenswrapper[4747]: I1128 13:33:44.636881 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:44 crc kubenswrapper[4747]: I1128 13:33:44.752645 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4747]: I1128 13:33:47.633300 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:33:47 crc kubenswrapper[4747]: I1128 13:33:47.634021 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:33:48 crc kubenswrapper[4747]: I1128 13:33:48.022743 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:48 crc kubenswrapper[4747]: I1128 13:33:48.023016 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:48 crc kubenswrapper[4747]: I1128 13:33:48.054327 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:48 crc kubenswrapper[4747]: I1128 13:33:48.387867 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:33:52 crc kubenswrapper[4747]: I1128 13:33:52.637991 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-2" podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerName="galera" probeResult="failure" output=< Nov 28 13:33:52 crc kubenswrapper[4747]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Nov 28 13:33:52 crc kubenswrapper[4747]: > Nov 28 13:33:57 crc kubenswrapper[4747]: I1128 13:33:57.132234 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:57 crc kubenswrapper[4747]: I1128 13:33:57.224933 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:34:00 crc kubenswrapper[4747]: I1128 13:34:00.715798 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:34:00 crc kubenswrapper[4747]: I1128 13:34:00.801315 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.576272 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj"] Nov 28 13:34:02 crc kubenswrapper[4747]: E1128 13:34:02.576883 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010" containerName="registry-server" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.576901 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010" containerName="registry-server" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.577051 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a65ab43-3fe7-443e-8ee3-1d3e8bdb8010" 
containerName="registry-server" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.578156 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.580486 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-stc2j" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.596188 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj"] Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.635423 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.635477 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.635518 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5bwp\" (UniqueName: \"kubernetes.io/projected/75538f50-4039-471c-842f-85941607c65e-kube-api-access-n5bwp\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.736675 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.736780 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.736827 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5bwp\" (UniqueName: \"kubernetes.io/projected/75538f50-4039-471c-842f-85941607c65e-kube-api-access-n5bwp\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.737531 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.737752 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.758845 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5bwp\" (UniqueName: \"kubernetes.io/projected/75538f50-4039-471c-842f-85941607c65e-kube-api-access-n5bwp\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:02 crc kubenswrapper[4747]: I1128 13:34:02.948518 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:03 crc kubenswrapper[4747]: I1128 13:34:03.375967 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj"] Nov 28 13:34:03 crc kubenswrapper[4747]: I1128 13:34:03.475948 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" event={"ID":"75538f50-4039-471c-842f-85941607c65e","Type":"ContainerStarted","Data":"ffb8f2215a3d33425e1c1ce16cc3109068e6fcf67cbcb84d4527ec588fd1eb30"} Nov 28 13:34:04 crc kubenswrapper[4747]: I1128 13:34:04.486858 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" 
event={"ID":"75538f50-4039-471c-842f-85941607c65e","Type":"ContainerDied","Data":"ffce513af174caf5f25e9ef78dfddd59ecade449acc0bf2440b313b6184e57ee"} Nov 28 13:34:04 crc kubenswrapper[4747]: I1128 13:34:04.486673 4747 generic.go:334] "Generic (PLEG): container finished" podID="75538f50-4039-471c-842f-85941607c65e" containerID="ffce513af174caf5f25e9ef78dfddd59ecade449acc0bf2440b313b6184e57ee" exitCode=0 Nov 28 13:34:06 crc kubenswrapper[4747]: I1128 13:34:06.506409 4747 generic.go:334] "Generic (PLEG): container finished" podID="75538f50-4039-471c-842f-85941607c65e" containerID="4f0e292ecff44610335134bb94e5a6fdb5e432feb44a6b8f722d58c62b85b4ca" exitCode=0 Nov 28 13:34:06 crc kubenswrapper[4747]: I1128 13:34:06.506505 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" event={"ID":"75538f50-4039-471c-842f-85941607c65e","Type":"ContainerDied","Data":"4f0e292ecff44610335134bb94e5a6fdb5e432feb44a6b8f722d58c62b85b4ca"} Nov 28 13:34:07 crc kubenswrapper[4747]: I1128 13:34:07.516327 4747 generic.go:334] "Generic (PLEG): container finished" podID="75538f50-4039-471c-842f-85941607c65e" containerID="d64055eb6ce40a045ae28e56e44b66b5ff9a14a19933189dce9ebafa3c8bcad7" exitCode=0 Nov 28 13:34:07 crc kubenswrapper[4747]: I1128 13:34:07.516390 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" event={"ID":"75538f50-4039-471c-842f-85941607c65e","Type":"ContainerDied","Data":"d64055eb6ce40a045ae28e56e44b66b5ff9a14a19933189dce9ebafa3c8bcad7"} Nov 28 13:34:08 crc kubenswrapper[4747]: I1128 13:34:08.861451 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:08 crc kubenswrapper[4747]: I1128 13:34:08.931820 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-bundle\") pod \"75538f50-4039-471c-842f-85941607c65e\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " Nov 28 13:34:08 crc kubenswrapper[4747]: I1128 13:34:08.931930 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5bwp\" (UniqueName: \"kubernetes.io/projected/75538f50-4039-471c-842f-85941607c65e-kube-api-access-n5bwp\") pod \"75538f50-4039-471c-842f-85941607c65e\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " Nov 28 13:34:08 crc kubenswrapper[4747]: I1128 13:34:08.932111 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-util\") pod \"75538f50-4039-471c-842f-85941607c65e\" (UID: \"75538f50-4039-471c-842f-85941607c65e\") " Nov 28 13:34:08 crc kubenswrapper[4747]: I1128 13:34:08.933334 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-bundle" (OuterVolumeSpecName: "bundle") pod "75538f50-4039-471c-842f-85941607c65e" (UID: "75538f50-4039-471c-842f-85941607c65e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:08 crc kubenswrapper[4747]: I1128 13:34:08.936835 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75538f50-4039-471c-842f-85941607c65e-kube-api-access-n5bwp" (OuterVolumeSpecName: "kube-api-access-n5bwp") pod "75538f50-4039-471c-842f-85941607c65e" (UID: "75538f50-4039-471c-842f-85941607c65e"). InnerVolumeSpecName "kube-api-access-n5bwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:34:08 crc kubenswrapper[4747]: I1128 13:34:08.944734 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-util" (OuterVolumeSpecName: "util") pod "75538f50-4039-471c-842f-85941607c65e" (UID: "75538f50-4039-471c-842f-85941607c65e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:09 crc kubenswrapper[4747]: I1128 13:34:09.034367 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5bwp\" (UniqueName: \"kubernetes.io/projected/75538f50-4039-471c-842f-85941607c65e-kube-api-access-n5bwp\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:09 crc kubenswrapper[4747]: I1128 13:34:09.034423 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:09 crc kubenswrapper[4747]: I1128 13:34:09.034446 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75538f50-4039-471c-842f-85941607c65e-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:09 crc kubenswrapper[4747]: I1128 13:34:09.533638 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" event={"ID":"75538f50-4039-471c-842f-85941607c65e","Type":"ContainerDied","Data":"ffb8f2215a3d33425e1c1ce16cc3109068e6fcf67cbcb84d4527ec588fd1eb30"} Nov 28 13:34:09 crc kubenswrapper[4747]: I1128 13:34:09.533682 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb8f2215a3d33425e1c1ce16cc3109068e6fcf67cbcb84d4527ec588fd1eb30" Nov 28 13:34:09 crc kubenswrapper[4747]: I1128 13:34:09.533724 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.632776 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.633152 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.663287 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm"] Nov 28 13:34:17 crc kubenswrapper[4747]: E1128 13:34:17.663838 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75538f50-4039-471c-842f-85941607c65e" containerName="util" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.663929 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="75538f50-4039-471c-842f-85941607c65e" containerName="util" Nov 28 13:34:17 crc kubenswrapper[4747]: E1128 13:34:17.664030 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75538f50-4039-471c-842f-85941607c65e" containerName="extract" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.664137 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="75538f50-4039-471c-842f-85941607c65e" containerName="extract" Nov 28 13:34:17 crc kubenswrapper[4747]: E1128 13:34:17.664272 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75538f50-4039-471c-842f-85941607c65e" containerName="pull" 
Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.664386 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="75538f50-4039-471c-842f-85941607c65e" containerName="pull" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.664686 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="75538f50-4039-471c-842f-85941607c65e" containerName="extract" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.665361 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.668251 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-4fpfs" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.675018 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm"] Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.761190 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr8p8\" (UniqueName: \"kubernetes.io/projected/2d18bad5-470c-4359-b98f-0cfcabfe8694-kube-api-access-gr8p8\") pod \"rabbitmq-cluster-operator-779fc9694b-7sfzm\" (UID: \"2d18bad5-470c-4359-b98f-0cfcabfe8694\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.862908 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr8p8\" (UniqueName: \"kubernetes.io/projected/2d18bad5-470c-4359-b98f-0cfcabfe8694-kube-api-access-gr8p8\") pod \"rabbitmq-cluster-operator-779fc9694b-7sfzm\" (UID: \"2d18bad5-470c-4359-b98f-0cfcabfe8694\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.903759 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gr8p8\" (UniqueName: \"kubernetes.io/projected/2d18bad5-470c-4359-b98f-0cfcabfe8694-kube-api-access-gr8p8\") pod \"rabbitmq-cluster-operator-779fc9694b-7sfzm\" (UID: \"2d18bad5-470c-4359-b98f-0cfcabfe8694\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" Nov 28 13:34:17 crc kubenswrapper[4747]: I1128 13:34:17.988550 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" Nov 28 13:34:18 crc kubenswrapper[4747]: I1128 13:34:18.407249 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm"] Nov 28 13:34:18 crc kubenswrapper[4747]: I1128 13:34:18.590015 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" event={"ID":"2d18bad5-470c-4359-b98f-0cfcabfe8694","Type":"ContainerStarted","Data":"6ca07a2427eb6e3130432646a1e617579336a1ac5802e7481e60f08fab5ae71f"} Nov 28 13:34:23 crc kubenswrapper[4747]: I1128 13:34:23.655516 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" event={"ID":"2d18bad5-470c-4359-b98f-0cfcabfe8694","Type":"ContainerStarted","Data":"da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a"} Nov 28 13:34:23 crc kubenswrapper[4747]: I1128 13:34:23.671975 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" podStartSLOduration=2.348656761 podStartE2EDuration="6.671953367s" podCreationTimestamp="2025-11-28 13:34:17 +0000 UTC" firstStartedPulling="2025-11-28 13:34:18.417401062 +0000 UTC m=+911.079882822" lastFinishedPulling="2025-11-28 13:34:22.740697698 +0000 UTC m=+915.403179428" observedRunningTime="2025-11-28 13:34:23.66852948 +0000 UTC m=+916.331011210" watchObservedRunningTime="2025-11-28 13:34:23.671953367 
+0000 UTC m=+916.334435107" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.053788 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.058599 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.060448 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.060763 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.060855 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.060977 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.061066 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-lsbkt" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.074060 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.132785 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.132849 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.132880 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.132899 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.132934 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.133101 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.133142 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbgx\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-kube-api-access-rzbgx\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.133173 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234281 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234436 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234453 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbgx\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-kube-api-access-rzbgx\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234475 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234507 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234554 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234574 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.234965 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.235739 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.237907 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.244021 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.244434 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.244507 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be1da50e494e40cd306c179d9195febc6e59a7687f41f39d4655605f7d457d71/globalmount\"" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.245744 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.248307 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.260349 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbgx\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-kube-api-access-rzbgx\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.290699 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\") pod \"rabbitmq-server-0\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.376725 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:30 crc kubenswrapper[4747]: I1128 13:34:30.802759 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.701727 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"69406e1d-82c4-485d-aaf5-e7c8ead8dc40","Type":"ContainerStarted","Data":"1c755292d5a0ee7204948a3105058ca194406000ab108e552908244d27dd6359"} Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.713261 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-k9qdh"] Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.714106 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-k9qdh" Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.717167 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-zrf94" Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.721390 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-k9qdh"] Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.857965 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qkm\" (UniqueName: \"kubernetes.io/projected/7c7a81ba-8c14-470c-9506-313e10edd72b-kube-api-access-45qkm\") pod \"keystone-operator-index-k9qdh\" (UID: \"7c7a81ba-8c14-470c-9506-313e10edd72b\") " pod="openstack-operators/keystone-operator-index-k9qdh" Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.960693 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45qkm\" (UniqueName: \"kubernetes.io/projected/7c7a81ba-8c14-470c-9506-313e10edd72b-kube-api-access-45qkm\") pod \"keystone-operator-index-k9qdh\" (UID: \"7c7a81ba-8c14-470c-9506-313e10edd72b\") " pod="openstack-operators/keystone-operator-index-k9qdh" Nov 28 13:34:31 crc kubenswrapper[4747]: I1128 13:34:31.983594 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qkm\" (UniqueName: \"kubernetes.io/projected/7c7a81ba-8c14-470c-9506-313e10edd72b-kube-api-access-45qkm\") pod \"keystone-operator-index-k9qdh\" (UID: \"7c7a81ba-8c14-470c-9506-313e10edd72b\") " pod="openstack-operators/keystone-operator-index-k9qdh" Nov 28 13:34:32 crc kubenswrapper[4747]: I1128 13:34:32.033882 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-k9qdh" Nov 28 13:34:32 crc kubenswrapper[4747]: I1128 13:34:32.465000 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-k9qdh"] Nov 28 13:34:32 crc kubenswrapper[4747]: W1128 13:34:32.471618 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7a81ba_8c14_470c_9506_313e10edd72b.slice/crio-794768a4ef6d19f4a26e9594dbb5e452fb530014b6802d25637860d6c185b01f WatchSource:0}: Error finding container 794768a4ef6d19f4a26e9594dbb5e452fb530014b6802d25637860d6c185b01f: Status 404 returned error can't find the container with id 794768a4ef6d19f4a26e9594dbb5e452fb530014b6802d25637860d6c185b01f Nov 28 13:34:32 crc kubenswrapper[4747]: I1128 13:34:32.708666 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k9qdh" event={"ID":"7c7a81ba-8c14-470c-9506-313e10edd72b","Type":"ContainerStarted","Data":"794768a4ef6d19f4a26e9594dbb5e452fb530014b6802d25637860d6c185b01f"} Nov 28 13:34:36 crc kubenswrapper[4747]: I1128 13:34:36.109928 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-k9qdh"] Nov 28 13:34:36 crc kubenswrapper[4747]: I1128 13:34:36.722749 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-g2f9t"] Nov 28 13:34:36 crc kubenswrapper[4747]: I1128 13:34:36.724282 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:36 crc kubenswrapper[4747]: I1128 13:34:36.728151 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-g2f9t"] Nov 28 13:34:36 crc kubenswrapper[4747]: I1128 13:34:36.835331 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz5qd\" (UniqueName: \"kubernetes.io/projected/c8f39d39-8a82-4e51-9a4c-81d2476e5d42-kube-api-access-zz5qd\") pod \"keystone-operator-index-g2f9t\" (UID: \"c8f39d39-8a82-4e51-9a4c-81d2476e5d42\") " pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:36 crc kubenswrapper[4747]: I1128 13:34:36.937741 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5qd\" (UniqueName: \"kubernetes.io/projected/c8f39d39-8a82-4e51-9a4c-81d2476e5d42-kube-api-access-zz5qd\") pod \"keystone-operator-index-g2f9t\" (UID: \"c8f39d39-8a82-4e51-9a4c-81d2476e5d42\") " pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:36 crc kubenswrapper[4747]: I1128 13:34:36.954517 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz5qd\" (UniqueName: \"kubernetes.io/projected/c8f39d39-8a82-4e51-9a4c-81d2476e5d42-kube-api-access-zz5qd\") pod \"keystone-operator-index-g2f9t\" (UID: \"c8f39d39-8a82-4e51-9a4c-81d2476e5d42\") " pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:37 crc kubenswrapper[4747]: I1128 13:34:37.052345 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:38 crc kubenswrapper[4747]: I1128 13:34:38.155239 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-g2f9t"] Nov 28 13:34:38 crc kubenswrapper[4747]: W1128 13:34:38.953234 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f39d39_8a82_4e51_9a4c_81d2476e5d42.slice/crio-839d7668a63cec36e123955c9b45b8a47232b98178e47d38414e9a8b180bcb30 WatchSource:0}: Error finding container 839d7668a63cec36e123955c9b45b8a47232b98178e47d38414e9a8b180bcb30: Status 404 returned error can't find the container with id 839d7668a63cec36e123955c9b45b8a47232b98178e47d38414e9a8b180bcb30 Nov 28 13:34:39 crc kubenswrapper[4747]: I1128 13:34:39.761040 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-g2f9t" event={"ID":"c8f39d39-8a82-4e51-9a4c-81d2476e5d42","Type":"ContainerStarted","Data":"9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b"} Nov 28 13:34:39 crc kubenswrapper[4747]: I1128 13:34:39.761108 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-g2f9t" event={"ID":"c8f39d39-8a82-4e51-9a4c-81d2476e5d42","Type":"ContainerStarted","Data":"839d7668a63cec36e123955c9b45b8a47232b98178e47d38414e9a8b180bcb30"} Nov 28 13:34:39 crc kubenswrapper[4747]: I1128 13:34:39.762786 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k9qdh" event={"ID":"7c7a81ba-8c14-470c-9506-313e10edd72b","Type":"ContainerStarted","Data":"1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8"} Nov 28 13:34:39 crc kubenswrapper[4747]: I1128 13:34:39.762923 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-k9qdh" 
podUID="7c7a81ba-8c14-470c-9506-313e10edd72b" containerName="registry-server" containerID="cri-o://1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8" gracePeriod=2 Nov 28 13:34:39 crc kubenswrapper[4747]: I1128 13:34:39.791965 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-g2f9t" podStartSLOduration=3.503071879 podStartE2EDuration="3.791946863s" podCreationTimestamp="2025-11-28 13:34:36 +0000 UTC" firstStartedPulling="2025-11-28 13:34:38.955812729 +0000 UTC m=+931.618294459" lastFinishedPulling="2025-11-28 13:34:39.244687673 +0000 UTC m=+931.907169443" observedRunningTime="2025-11-28 13:34:39.789597273 +0000 UTC m=+932.452079013" watchObservedRunningTime="2025-11-28 13:34:39.791946863 +0000 UTC m=+932.454428603" Nov 28 13:34:39 crc kubenswrapper[4747]: I1128 13:34:39.830404 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-k9qdh" podStartSLOduration=2.065047947 podStartE2EDuration="8.830371073s" podCreationTimestamp="2025-11-28 13:34:31 +0000 UTC" firstStartedPulling="2025-11-28 13:34:32.473762734 +0000 UTC m=+925.136244474" lastFinishedPulling="2025-11-28 13:34:39.23908586 +0000 UTC m=+931.901567600" observedRunningTime="2025-11-28 13:34:39.81888515 +0000 UTC m=+932.481366950" watchObservedRunningTime="2025-11-28 13:34:39.830371073 +0000 UTC m=+932.492852843" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.375265 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-k9qdh" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.486999 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45qkm\" (UniqueName: \"kubernetes.io/projected/7c7a81ba-8c14-470c-9506-313e10edd72b-kube-api-access-45qkm\") pod \"7c7a81ba-8c14-470c-9506-313e10edd72b\" (UID: \"7c7a81ba-8c14-470c-9506-313e10edd72b\") " Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.494239 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7a81ba-8c14-470c-9506-313e10edd72b-kube-api-access-45qkm" (OuterVolumeSpecName: "kube-api-access-45qkm") pod "7c7a81ba-8c14-470c-9506-313e10edd72b" (UID: "7c7a81ba-8c14-470c-9506-313e10edd72b"). InnerVolumeSpecName "kube-api-access-45qkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.588248 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45qkm\" (UniqueName: \"kubernetes.io/projected/7c7a81ba-8c14-470c-9506-313e10edd72b-kube-api-access-45qkm\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.773001 4747 generic.go:334] "Generic (PLEG): container finished" podID="7c7a81ba-8c14-470c-9506-313e10edd72b" containerID="1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8" exitCode=0 Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.773120 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-k9qdh" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.773179 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k9qdh" event={"ID":"7c7a81ba-8c14-470c-9506-313e10edd72b","Type":"ContainerDied","Data":"1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8"} Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.773275 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k9qdh" event={"ID":"7c7a81ba-8c14-470c-9506-313e10edd72b","Type":"ContainerDied","Data":"794768a4ef6d19f4a26e9594dbb5e452fb530014b6802d25637860d6c185b01f"} Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.773305 4747 scope.go:117] "RemoveContainer" containerID="1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.779304 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"69406e1d-82c4-485d-aaf5-e7c8ead8dc40","Type":"ContainerStarted","Data":"0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d"} Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.799027 4747 scope.go:117] "RemoveContainer" containerID="1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8" Nov 28 13:34:40 crc kubenswrapper[4747]: E1128 13:34:40.799639 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8\": container with ID starting with 1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8 not found: ID does not exist" containerID="1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.799699 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8"} err="failed to get container status \"1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8\": rpc error: code = NotFound desc = could not find container \"1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8\": container with ID starting with 1aebf168116e71ecdaac7a57b8d0447c4334cc0aa0aad9efeae45675baca84a8 not found: ID does not exist" Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.832777 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-k9qdh"] Nov 28 13:34:40 crc kubenswrapper[4747]: I1128 13:34:40.839793 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-k9qdh"] Nov 28 13:34:41 crc kubenswrapper[4747]: I1128 13:34:41.657316 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7a81ba-8c14-470c-9506-313e10edd72b" path="/var/lib/kubelet/pods/7c7a81ba-8c14-470c-9506-313e10edd72b/volumes" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.053122 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.054414 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.083182 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.633337 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 
13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.633446 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.633518 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.634649 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7faf1b409a382c4ed714300a1dd00c81a6791b386fd5f862cfc6c604d1093bb"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.634757 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://b7faf1b409a382c4ed714300a1dd00c81a6791b386fd5f862cfc6c604d1093bb" gracePeriod=600 Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.830604 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="b7faf1b409a382c4ed714300a1dd00c81a6791b386fd5f862cfc6c604d1093bb" exitCode=0 Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.830684 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" 
event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"b7faf1b409a382c4ed714300a1dd00c81a6791b386fd5f862cfc6c604d1093bb"} Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.831006 4747 scope.go:117] "RemoveContainer" containerID="5fdc296405c58b503731bb8ebbd3318202226659d1222af8e629d5358c8f2a8d" Nov 28 13:34:47 crc kubenswrapper[4747]: I1128 13:34:47.865641 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:34:48 crc kubenswrapper[4747]: I1128 13:34:48.841563 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"08e13cafa96480abebcf6277e7d8891630344ed15e24b7ed7d255d3a6f63b7d5"} Nov 28 13:34:49 crc kubenswrapper[4747]: I1128 13:34:49.966052 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm"] Nov 28 13:34:49 crc kubenswrapper[4747]: E1128 13:34:49.966777 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7a81ba-8c14-470c-9506-313e10edd72b" containerName="registry-server" Nov 28 13:34:49 crc kubenswrapper[4747]: I1128 13:34:49.966795 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7a81ba-8c14-470c-9506-313e10edd72b" containerName="registry-server" Nov 28 13:34:49 crc kubenswrapper[4747]: I1128 13:34:49.966964 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7a81ba-8c14-470c-9506-313e10edd72b" containerName="registry-server" Nov 28 13:34:49 crc kubenswrapper[4747]: I1128 13:34:49.968313 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:49 crc kubenswrapper[4747]: I1128 13:34:49.970469 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-stc2j" Nov 28 13:34:49 crc kubenswrapper[4747]: I1128 13:34:49.977726 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm"] Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.027981 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4htl\" (UniqueName: \"kubernetes.io/projected/721907f7-badd-4ac4-aba2-b2915ea6a9cb-kube-api-access-w4htl\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.028076 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-util\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.028306 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-bundle\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 
13:34:50.129579 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-util\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.129672 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-bundle\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.130406 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-bundle\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.130414 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-util\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.130500 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4htl\" (UniqueName: 
\"kubernetes.io/projected/721907f7-badd-4ac4-aba2-b2915ea6a9cb-kube-api-access-w4htl\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.150556 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4htl\" (UniqueName: \"kubernetes.io/projected/721907f7-badd-4ac4-aba2-b2915ea6a9cb-kube-api-access-w4htl\") pod \"35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.289693 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.772962 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm"] Nov 28 13:34:50 crc kubenswrapper[4747]: I1128 13:34:50.864764 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" event={"ID":"721907f7-badd-4ac4-aba2-b2915ea6a9cb","Type":"ContainerStarted","Data":"0d07bd1079b2da1b3f5c8915de8e4b7cb07c207645f1857215e8f7a3935a2e30"} Nov 28 13:34:51 crc kubenswrapper[4747]: I1128 13:34:51.876962 4747 generic.go:334] "Generic (PLEG): container finished" podID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerID="bb137ca1adf7e14e1f95e919f1b67891652d5a13194ec27b7ac6e5993d70ecef" exitCode=0 Nov 28 13:34:51 crc kubenswrapper[4747]: I1128 13:34:51.877043 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" event={"ID":"721907f7-badd-4ac4-aba2-b2915ea6a9cb","Type":"ContainerDied","Data":"bb137ca1adf7e14e1f95e919f1b67891652d5a13194ec27b7ac6e5993d70ecef"} Nov 28 13:34:53 crc kubenswrapper[4747]: I1128 13:34:53.891380 4747 generic.go:334] "Generic (PLEG): container finished" podID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerID="08917f44aced52183a52e119529e2ccea1cb11fdb3cdfc61c818ef39b0d713c9" exitCode=0 Nov 28 13:34:53 crc kubenswrapper[4747]: I1128 13:34:53.891471 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" event={"ID":"721907f7-badd-4ac4-aba2-b2915ea6a9cb","Type":"ContainerDied","Data":"08917f44aced52183a52e119529e2ccea1cb11fdb3cdfc61c818ef39b0d713c9"} Nov 28 13:34:54 crc kubenswrapper[4747]: I1128 13:34:54.904466 4747 generic.go:334] "Generic (PLEG): container finished" podID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerID="c00c8da01d7c8c5acd03b8d257df4f8996c95894d29cf735e5d8f4fe15cd1cdc" exitCode=0 Nov 28 13:34:54 crc kubenswrapper[4747]: I1128 13:34:54.904518 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" event={"ID":"721907f7-badd-4ac4-aba2-b2915ea6a9cb","Type":"ContainerDied","Data":"c00c8da01d7c8c5acd03b8d257df4f8996c95894d29cf735e5d8f4fe15cd1cdc"} Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.256028 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.286710 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4htl\" (UniqueName: \"kubernetes.io/projected/721907f7-badd-4ac4-aba2-b2915ea6a9cb-kube-api-access-w4htl\") pod \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.286932 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-bundle\") pod \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.287002 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-util\") pod \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\" (UID: \"721907f7-badd-4ac4-aba2-b2915ea6a9cb\") " Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.288360 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-bundle" (OuterVolumeSpecName: "bundle") pod "721907f7-badd-4ac4-aba2-b2915ea6a9cb" (UID: "721907f7-badd-4ac4-aba2-b2915ea6a9cb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.295185 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721907f7-badd-4ac4-aba2-b2915ea6a9cb-kube-api-access-w4htl" (OuterVolumeSpecName: "kube-api-access-w4htl") pod "721907f7-badd-4ac4-aba2-b2915ea6a9cb" (UID: "721907f7-badd-4ac4-aba2-b2915ea6a9cb"). InnerVolumeSpecName "kube-api-access-w4htl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.388897 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.388946 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4htl\" (UniqueName: \"kubernetes.io/projected/721907f7-badd-4ac4-aba2-b2915ea6a9cb-kube-api-access-w4htl\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.441234 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-util" (OuterVolumeSpecName: "util") pod "721907f7-badd-4ac4-aba2-b2915ea6a9cb" (UID: "721907f7-badd-4ac4-aba2-b2915ea6a9cb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.491184 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/721907f7-badd-4ac4-aba2-b2915ea6a9cb-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.923350 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" event={"ID":"721907f7-badd-4ac4-aba2-b2915ea6a9cb","Type":"ContainerDied","Data":"0d07bd1079b2da1b3f5c8915de8e4b7cb07c207645f1857215e8f7a3935a2e30"} Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.923788 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d07bd1079b2da1b3f5c8915de8e4b7cb07c207645f1857215e8f7a3935a2e30" Nov 28 13:34:56 crc kubenswrapper[4747]: I1128 13:34:56.923440 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.716046 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd"] Nov 28 13:35:05 crc kubenswrapper[4747]: E1128 13:35:05.716816 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerName="util" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.716827 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerName="util" Nov 28 13:35:05 crc kubenswrapper[4747]: E1128 13:35:05.716849 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerName="pull" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.716855 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerName="pull" Nov 28 13:35:05 crc kubenswrapper[4747]: E1128 13:35:05.716871 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerName="extract" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.716877 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerName="extract" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.716990 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" containerName="extract" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.717408 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.719431 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2ws6b" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.719480 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.732729 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd"] Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.838758 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpbgc\" (UniqueName: \"kubernetes.io/projected/8e64428a-5763-44ae-87c3-e45ba2c3a039-kube-api-access-tpbgc\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.839096 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-apiservice-cert\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.839156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-webhook-cert\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" 
(UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.941131 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-apiservice-cert\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.941347 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-webhook-cert\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.941476 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpbgc\" (UniqueName: \"kubernetes.io/projected/8e64428a-5763-44ae-87c3-e45ba2c3a039-kube-api-access-tpbgc\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.951281 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-webhook-cert\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.959429 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-apiservice-cert\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:05 crc kubenswrapper[4747]: I1128 13:35:05.960011 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpbgc\" (UniqueName: \"kubernetes.io/projected/8e64428a-5763-44ae-87c3-e45ba2c3a039-kube-api-access-tpbgc\") pod \"keystone-operator-controller-manager-589f96b8dd-d5xbd\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:06 crc kubenswrapper[4747]: I1128 13:35:06.032717 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:06 crc kubenswrapper[4747]: I1128 13:35:06.527632 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd"] Nov 28 13:35:07 crc kubenswrapper[4747]: I1128 13:35:07.000296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" event={"ID":"8e64428a-5763-44ae-87c3-e45ba2c3a039","Type":"ContainerStarted","Data":"ceee4b7d7d250a26816cc0e06323bab5f9788e1db8a6a83d7d88c0f0c7878684"} Nov 28 13:35:12 crc kubenswrapper[4747]: I1128 13:35:12.036816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" event={"ID":"8e64428a-5763-44ae-87c3-e45ba2c3a039","Type":"ContainerStarted","Data":"7cb94193d218edb947e0d91e36be5b76272f7596495c4c5388a8a911bdd00bcb"} Nov 28 13:35:12 crc kubenswrapper[4747]: I1128 
13:35:12.037414 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:12 crc kubenswrapper[4747]: I1128 13:35:12.056944 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" podStartSLOduration=2.6025967359999997 podStartE2EDuration="7.056921791s" podCreationTimestamp="2025-11-28 13:35:05 +0000 UTC" firstStartedPulling="2025-11-28 13:35:06.540923633 +0000 UTC m=+959.203405383" lastFinishedPulling="2025-11-28 13:35:10.995248708 +0000 UTC m=+963.657730438" observedRunningTime="2025-11-28 13:35:12.051714778 +0000 UTC m=+964.714196518" watchObservedRunningTime="2025-11-28 13:35:12.056921791 +0000 UTC m=+964.719403531" Nov 28 13:35:13 crc kubenswrapper[4747]: I1128 13:35:13.046340 4747 generic.go:334] "Generic (PLEG): container finished" podID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerID="0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d" exitCode=0 Nov 28 13:35:13 crc kubenswrapper[4747]: I1128 13:35:13.046385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"69406e1d-82c4-485d-aaf5-e7c8ead8dc40","Type":"ContainerDied","Data":"0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d"} Nov 28 13:35:14 crc kubenswrapper[4747]: I1128 13:35:14.057599 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"69406e1d-82c4-485d-aaf5-e7c8ead8dc40","Type":"ContainerStarted","Data":"4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9"} Nov 28 13:35:14 crc kubenswrapper[4747]: I1128 13:35:14.058314 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:35:14 crc kubenswrapper[4747]: I1128 13:35:14.090690 4747 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="keystone-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.662227785 podStartE2EDuration="45.090659104s" podCreationTimestamp="2025-11-28 13:34:29 +0000 UTC" firstStartedPulling="2025-11-28 13:34:30.809997704 +0000 UTC m=+923.472479434" lastFinishedPulling="2025-11-28 13:34:39.238429023 +0000 UTC m=+931.900910753" observedRunningTime="2025-11-28 13:35:14.084940118 +0000 UTC m=+966.747421848" watchObservedRunningTime="2025-11-28 13:35:14.090659104 +0000 UTC m=+966.753140874" Nov 28 13:35:16 crc kubenswrapper[4747]: I1128 13:35:16.037703 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.784482 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn"] Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.785934 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.788810 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.802725 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn"] Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.890411 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdcmw\" (UniqueName: \"kubernetes.io/projected/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-kube-api-access-tdcmw\") pod \"keystone-5daa-account-create-update-g2qvn\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.890649 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-operator-scripts\") pod \"keystone-5daa-account-create-update-g2qvn\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.890688 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-lz8sc"] Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.891695 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.895358 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-lz8sc"] Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.991777 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-operator-scripts\") pod \"keystone-5daa-account-create-update-g2qvn\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.991885 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkqw4\" (UniqueName: \"kubernetes.io/projected/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-kube-api-access-rkqw4\") pod \"keystone-db-create-lz8sc\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.991988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-operator-scripts\") pod \"keystone-db-create-lz8sc\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:23 crc kubenswrapper[4747]: I1128 13:35:23.992026 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdcmw\" (UniqueName: \"kubernetes.io/projected/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-kube-api-access-tdcmw\") pod \"keystone-5daa-account-create-update-g2qvn\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:23 crc kubenswrapper[4747]: 
I1128 13:35:23.993015 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-operator-scripts\") pod \"keystone-5daa-account-create-update-g2qvn\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.021422 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdcmw\" (UniqueName: \"kubernetes.io/projected/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-kube-api-access-tdcmw\") pod \"keystone-5daa-account-create-update-g2qvn\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.093609 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkqw4\" (UniqueName: \"kubernetes.io/projected/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-kube-api-access-rkqw4\") pod \"keystone-db-create-lz8sc\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.094054 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-operator-scripts\") pod \"keystone-db-create-lz8sc\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.094715 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-operator-scripts\") pod \"keystone-db-create-lz8sc\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " 
pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.109645 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkqw4\" (UniqueName: \"kubernetes.io/projected/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-kube-api-access-rkqw4\") pod \"keystone-db-create-lz8sc\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.112518 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.207095 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.518849 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn"] Nov 28 13:35:24 crc kubenswrapper[4747]: I1128 13:35:24.613889 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-lz8sc"] Nov 28 13:35:25 crc kubenswrapper[4747]: I1128 13:35:25.133436 4747 generic.go:334] "Generic (PLEG): container finished" podID="a1f9a105-84d9-4be5-b9af-b5c2cfd5761e" containerID="fbd9a16b5ed2077c637d764ab51a37cb259d5ef712746a7062f927806e0630da" exitCode=0 Nov 28 13:35:25 crc kubenswrapper[4747]: I1128 13:35:25.133513 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-lz8sc" event={"ID":"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e","Type":"ContainerDied","Data":"fbd9a16b5ed2077c637d764ab51a37cb259d5ef712746a7062f927806e0630da"} Nov 28 13:35:25 crc kubenswrapper[4747]: I1128 13:35:25.133836 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-lz8sc" 
event={"ID":"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e","Type":"ContainerStarted","Data":"8a50846cb29105b37a2e8fe82296875e424c2698a14048d7a62c0787ee02dfe9"} Nov 28 13:35:25 crc kubenswrapper[4747]: I1128 13:35:25.135724 4747 generic.go:334] "Generic (PLEG): container finished" podID="9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e" containerID="8cb92922f206b39e1e7636a950f851fbe747745a892cffe8e947a72194a2549b" exitCode=0 Nov 28 13:35:25 crc kubenswrapper[4747]: I1128 13:35:25.135949 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" event={"ID":"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e","Type":"ContainerDied","Data":"8cb92922f206b39e1e7636a950f851fbe747745a892cffe8e947a72194a2549b"} Nov 28 13:35:25 crc kubenswrapper[4747]: I1128 13:35:25.136052 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" event={"ID":"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e","Type":"ContainerStarted","Data":"9787a1cf30d34881ec66c6aa46d97a0ae514f082c1109bc6347f64909df70b73"} Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.498655 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.503830 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.629279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-operator-scripts\") pod \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.629497 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdcmw\" (UniqueName: \"kubernetes.io/projected/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-kube-api-access-tdcmw\") pod \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.629625 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-operator-scripts\") pod \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\" (UID: \"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e\") " Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.629709 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkqw4\" (UniqueName: \"kubernetes.io/projected/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-kube-api-access-rkqw4\") pod \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\" (UID: \"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e\") " Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.631348 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e" (UID: "9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.631488 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1f9a105-84d9-4be5-b9af-b5c2cfd5761e" (UID: "a1f9a105-84d9-4be5-b9af-b5c2cfd5761e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.641682 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-kube-api-access-tdcmw" (OuterVolumeSpecName: "kube-api-access-tdcmw") pod "9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e" (UID: "9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e"). InnerVolumeSpecName "kube-api-access-tdcmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.642524 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-kube-api-access-rkqw4" (OuterVolumeSpecName: "kube-api-access-rkqw4") pod "a1f9a105-84d9-4be5-b9af-b5c2cfd5761e" (UID: "a1f9a105-84d9-4be5-b9af-b5c2cfd5761e"). InnerVolumeSpecName "kube-api-access-rkqw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.732584 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdcmw\" (UniqueName: \"kubernetes.io/projected/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-kube-api-access-tdcmw\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.732651 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.732669 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkqw4\" (UniqueName: \"kubernetes.io/projected/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-kube-api-access-rkqw4\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:26 crc kubenswrapper[4747]: I1128 13:35:26.732683 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:27 crc kubenswrapper[4747]: I1128 13:35:27.153178 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-lz8sc" event={"ID":"a1f9a105-84d9-4be5-b9af-b5c2cfd5761e","Type":"ContainerDied","Data":"8a50846cb29105b37a2e8fe82296875e424c2698a14048d7a62c0787ee02dfe9"} Nov 28 13:35:27 crc kubenswrapper[4747]: I1128 13:35:27.153240 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a50846cb29105b37a2e8fe82296875e424c2698a14048d7a62c0787ee02dfe9" Nov 28 13:35:27 crc kubenswrapper[4747]: I1128 13:35:27.153742 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-lz8sc" Nov 28 13:35:27 crc kubenswrapper[4747]: I1128 13:35:27.155033 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" event={"ID":"9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e","Type":"ContainerDied","Data":"9787a1cf30d34881ec66c6aa46d97a0ae514f082c1109bc6347f64909df70b73"} Nov 28 13:35:27 crc kubenswrapper[4747]: I1128 13:35:27.155081 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9787a1cf30d34881ec66c6aa46d97a0ae514f082c1109bc6347f64909df70b73" Nov 28 13:35:27 crc kubenswrapper[4747]: I1128 13:35:27.155094 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.380815 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.936713 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6q82p"] Nov 28 13:35:30 crc kubenswrapper[4747]: E1128 13:35:30.937347 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e" containerName="mariadb-account-create-update" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.937368 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e" containerName="mariadb-account-create-update" Nov 28 13:35:30 crc kubenswrapper[4747]: E1128 13:35:30.937387 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f9a105-84d9-4be5-b9af-b5c2cfd5761e" containerName="mariadb-database-create" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.937396 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f9a105-84d9-4be5-b9af-b5c2cfd5761e" 
containerName="mariadb-database-create" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.937542 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f9a105-84d9-4be5-b9af-b5c2cfd5761e" containerName="mariadb-database-create" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.937566 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e" containerName="mariadb-account-create-update" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.938098 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.941106 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.941724 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.941775 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-rtwxn" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.941851 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:35:30 crc kubenswrapper[4747]: I1128 13:35:30.949531 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6q82p"] Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.012042 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3a5419-e275-4c88-96e2-712ada1896b9-config-data\") pod \"keystone-db-sync-6q82p\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.012147 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6lw\" (UniqueName: \"kubernetes.io/projected/6c3a5419-e275-4c88-96e2-712ada1896b9-kube-api-access-rx6lw\") pod \"keystone-db-sync-6q82p\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.113752 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6lw\" (UniqueName: \"kubernetes.io/projected/6c3a5419-e275-4c88-96e2-712ada1896b9-kube-api-access-rx6lw\") pod \"keystone-db-sync-6q82p\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.113860 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3a5419-e275-4c88-96e2-712ada1896b9-config-data\") pod \"keystone-db-sync-6q82p\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.133299 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6lw\" (UniqueName: \"kubernetes.io/projected/6c3a5419-e275-4c88-96e2-712ada1896b9-kube-api-access-rx6lw\") pod \"keystone-db-sync-6q82p\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.134501 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3a5419-e275-4c88-96e2-712ada1896b9-config-data\") pod \"keystone-db-sync-6q82p\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.260849 4747 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.672932 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6q82p"] Nov 28 13:35:31 crc kubenswrapper[4747]: W1128 13:35:31.684072 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c3a5419_e275_4c88_96e2_712ada1896b9.slice/crio-af39c23aae13d2ce63150b80080b76b0eec81ebdc17fc526f3c4bdc203d28897 WatchSource:0}: Error finding container af39c23aae13d2ce63150b80080b76b0eec81ebdc17fc526f3c4bdc203d28897: Status 404 returned error can't find the container with id af39c23aae13d2ce63150b80080b76b0eec81ebdc17fc526f3c4bdc203d28897 Nov 28 13:35:31 crc kubenswrapper[4747]: I1128 13:35:31.687772 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:35:32 crc kubenswrapper[4747]: I1128 13:35:32.190138 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" event={"ID":"6c3a5419-e275-4c88-96e2-712ada1896b9","Type":"ContainerStarted","Data":"af39c23aae13d2ce63150b80080b76b0eec81ebdc17fc526f3c4bdc203d28897"} Nov 28 13:35:42 crc kubenswrapper[4747]: I1128 13:35:42.280876 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" event={"ID":"6c3a5419-e275-4c88-96e2-712ada1896b9","Type":"ContainerStarted","Data":"cdb37435c168413a7f19caad7a26b380ce64cddffee70ca39260303d74d95124"} Nov 28 13:35:42 crc kubenswrapper[4747]: I1128 13:35:42.306896 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" podStartSLOduration=2.3321929900000002 podStartE2EDuration="12.306857485s" podCreationTimestamp="2025-11-28 13:35:30 +0000 UTC" firstStartedPulling="2025-11-28 13:35:31.687579104 +0000 UTC m=+984.350060834" 
lastFinishedPulling="2025-11-28 13:35:41.662243589 +0000 UTC m=+994.324725329" observedRunningTime="2025-11-28 13:35:42.297530429 +0000 UTC m=+994.960012169" watchObservedRunningTime="2025-11-28 13:35:42.306857485 +0000 UTC m=+994.969339255" Nov 28 13:35:46 crc kubenswrapper[4747]: I1128 13:35:46.317514 4747 generic.go:334] "Generic (PLEG): container finished" podID="6c3a5419-e275-4c88-96e2-712ada1896b9" containerID="cdb37435c168413a7f19caad7a26b380ce64cddffee70ca39260303d74d95124" exitCode=0 Nov 28 13:35:46 crc kubenswrapper[4747]: I1128 13:35:46.317619 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" event={"ID":"6c3a5419-e275-4c88-96e2-712ada1896b9","Type":"ContainerDied","Data":"cdb37435c168413a7f19caad7a26b380ce64cddffee70ca39260303d74d95124"} Nov 28 13:35:47 crc kubenswrapper[4747]: I1128 13:35:47.632159 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:47 crc kubenswrapper[4747]: I1128 13:35:47.790618 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3a5419-e275-4c88-96e2-712ada1896b9-config-data\") pod \"6c3a5419-e275-4c88-96e2-712ada1896b9\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " Nov 28 13:35:47 crc kubenswrapper[4747]: I1128 13:35:47.791269 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx6lw\" (UniqueName: \"kubernetes.io/projected/6c3a5419-e275-4c88-96e2-712ada1896b9-kube-api-access-rx6lw\") pod \"6c3a5419-e275-4c88-96e2-712ada1896b9\" (UID: \"6c3a5419-e275-4c88-96e2-712ada1896b9\") " Nov 28 13:35:47 crc kubenswrapper[4747]: I1128 13:35:47.802949 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3a5419-e275-4c88-96e2-712ada1896b9-kube-api-access-rx6lw" (OuterVolumeSpecName: "kube-api-access-rx6lw") pod 
"6c3a5419-e275-4c88-96e2-712ada1896b9" (UID: "6c3a5419-e275-4c88-96e2-712ada1896b9"). InnerVolumeSpecName "kube-api-access-rx6lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:35:47 crc kubenswrapper[4747]: I1128 13:35:47.846855 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3a5419-e275-4c88-96e2-712ada1896b9-config-data" (OuterVolumeSpecName: "config-data") pod "6c3a5419-e275-4c88-96e2-712ada1896b9" (UID: "6c3a5419-e275-4c88-96e2-712ada1896b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:35:47 crc kubenswrapper[4747]: I1128 13:35:47.893863 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3a5419-e275-4c88-96e2-712ada1896b9-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:47 crc kubenswrapper[4747]: I1128 13:35:47.893919 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx6lw\" (UniqueName: \"kubernetes.io/projected/6c3a5419-e275-4c88-96e2-712ada1896b9-kube-api-access-rx6lw\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.333714 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" event={"ID":"6c3a5419-e275-4c88-96e2-712ada1896b9","Type":"ContainerDied","Data":"af39c23aae13d2ce63150b80080b76b0eec81ebdc17fc526f3c4bdc203d28897"} Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.333758 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af39c23aae13d2ce63150b80080b76b0eec81ebdc17fc526f3c4bdc203d28897" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.333810 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6q82p" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.563528 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-77z47"] Nov 28 13:35:48 crc kubenswrapper[4747]: E1128 13:35:48.563883 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3a5419-e275-4c88-96e2-712ada1896b9" containerName="keystone-db-sync" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.563908 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3a5419-e275-4c88-96e2-712ada1896b9" containerName="keystone-db-sync" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.564105 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c3a5419-e275-4c88-96e2-712ada1896b9" containerName="keystone-db-sync" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.564725 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.568249 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.568499 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.569920 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.573721 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.575796 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-77z47"] Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.584296 4747 reflector.go:368] Caches populated for 
*v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-rtwxn" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.707826 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-scripts\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.707959 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-credential-keys\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.708048 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-fernet-keys\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.708077 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-config-data\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.708127 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db655\" (UniqueName: \"kubernetes.io/projected/37e85120-b1df-47d7-854b-2898eb3b78fb-kube-api-access-db655\") pod 
\"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.809357 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-scripts\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.809415 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-credential-keys\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.809450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-fernet-keys\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.809476 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-config-data\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.809498 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db655\" (UniqueName: \"kubernetes.io/projected/37e85120-b1df-47d7-854b-2898eb3b78fb-kube-api-access-db655\") pod \"keystone-bootstrap-77z47\" (UID: 
\"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.814553 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-scripts\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.815021 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-config-data\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.816058 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-fernet-keys\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.817677 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-credential-keys\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.839595 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db655\" (UniqueName: \"kubernetes.io/projected/37e85120-b1df-47d7-854b-2898eb3b78fb-kube-api-access-db655\") pod \"keystone-bootstrap-77z47\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 
28 13:35:48 crc kubenswrapper[4747]: I1128 13:35:48.885969 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:49 crc kubenswrapper[4747]: I1128 13:35:49.107701 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-77z47"] Nov 28 13:35:49 crc kubenswrapper[4747]: I1128 13:35:49.343364 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" event={"ID":"37e85120-b1df-47d7-854b-2898eb3b78fb","Type":"ContainerStarted","Data":"3bcba4fbd89bfb37e905f045754db6be1edfaf916f9595498a0fa14508e9702d"} Nov 28 13:35:49 crc kubenswrapper[4747]: I1128 13:35:49.343834 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" event={"ID":"37e85120-b1df-47d7-854b-2898eb3b78fb","Type":"ContainerStarted","Data":"74deecbe487c5a1959ca045d42a13b3b290b6cff759ef2c22af95b4e51704059"} Nov 28 13:35:49 crc kubenswrapper[4747]: I1128 13:35:49.369534 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" podStartSLOduration=1.369508428 podStartE2EDuration="1.369508428s" podCreationTimestamp="2025-11-28 13:35:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:35:49.364184333 +0000 UTC m=+1002.026666073" watchObservedRunningTime="2025-11-28 13:35:49.369508428 +0000 UTC m=+1002.031990168" Nov 28 13:35:51 crc kubenswrapper[4747]: E1128 13:35:51.904881 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e85120_b1df_47d7_854b_2898eb3b78fb.slice/crio-3bcba4fbd89bfb37e905f045754db6be1edfaf916f9595498a0fa14508e9702d.scope\": RecentStats: unable to find data in memory cache]" Nov 
28 13:35:52 crc kubenswrapper[4747]: I1128 13:35:52.364075 4747 generic.go:334] "Generic (PLEG): container finished" podID="37e85120-b1df-47d7-854b-2898eb3b78fb" containerID="3bcba4fbd89bfb37e905f045754db6be1edfaf916f9595498a0fa14508e9702d" exitCode=0 Nov 28 13:35:52 crc kubenswrapper[4747]: I1128 13:35:52.364130 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" event={"ID":"37e85120-b1df-47d7-854b-2898eb3b78fb","Type":"ContainerDied","Data":"3bcba4fbd89bfb37e905f045754db6be1edfaf916f9595498a0fa14508e9702d"} Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.675850 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.782929 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db655\" (UniqueName: \"kubernetes.io/projected/37e85120-b1df-47d7-854b-2898eb3b78fb-kube-api-access-db655\") pod \"37e85120-b1df-47d7-854b-2898eb3b78fb\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.783045 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-fernet-keys\") pod \"37e85120-b1df-47d7-854b-2898eb3b78fb\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.783081 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-scripts\") pod \"37e85120-b1df-47d7-854b-2898eb3b78fb\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.783139 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-credential-keys\") pod \"37e85120-b1df-47d7-854b-2898eb3b78fb\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.783273 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-config-data\") pod \"37e85120-b1df-47d7-854b-2898eb3b78fb\" (UID: \"37e85120-b1df-47d7-854b-2898eb3b78fb\") " Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.789110 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-scripts" (OuterVolumeSpecName: "scripts") pod "37e85120-b1df-47d7-854b-2898eb3b78fb" (UID: "37e85120-b1df-47d7-854b-2898eb3b78fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.789545 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "37e85120-b1df-47d7-854b-2898eb3b78fb" (UID: "37e85120-b1df-47d7-854b-2898eb3b78fb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.790352 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e85120-b1df-47d7-854b-2898eb3b78fb-kube-api-access-db655" (OuterVolumeSpecName: "kube-api-access-db655") pod "37e85120-b1df-47d7-854b-2898eb3b78fb" (UID: "37e85120-b1df-47d7-854b-2898eb3b78fb"). InnerVolumeSpecName "kube-api-access-db655". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.790409 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "37e85120-b1df-47d7-854b-2898eb3b78fb" (UID: "37e85120-b1df-47d7-854b-2898eb3b78fb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.805563 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-config-data" (OuterVolumeSpecName: "config-data") pod "37e85120-b1df-47d7-854b-2898eb3b78fb" (UID: "37e85120-b1df-47d7-854b-2898eb3b78fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.886904 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.886967 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db655\" (UniqueName: \"kubernetes.io/projected/37e85120-b1df-47d7-854b-2898eb3b78fb-kube-api-access-db655\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.886985 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:53 crc kubenswrapper[4747]: I1128 13:35:53.887001 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:53 crc 
kubenswrapper[4747]: I1128 13:35:53.887014 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37e85120-b1df-47d7-854b-2898eb3b78fb-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.381638 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" event={"ID":"37e85120-b1df-47d7-854b-2898eb3b78fb","Type":"ContainerDied","Data":"74deecbe487c5a1959ca045d42a13b3b290b6cff759ef2c22af95b4e51704059"} Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.381940 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74deecbe487c5a1959ca045d42a13b3b290b6cff759ef2c22af95b4e51704059" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.381734 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-77z47" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.484176 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv"] Nov 28 13:35:54 crc kubenswrapper[4747]: E1128 13:35:54.485191 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e85120-b1df-47d7-854b-2898eb3b78fb" containerName="keystone-bootstrap" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.485270 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e85120-b1df-47d7-854b-2898eb3b78fb" containerName="keystone-bootstrap" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.485762 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e85120-b1df-47d7-854b-2898eb3b78fb" containerName="keystone-bootstrap" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.487049 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.495817 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-rtwxn" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.497138 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.497556 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.497793 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.501085 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv"] Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.595362 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-scripts\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.595476 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-credential-keys\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.595545 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxpn\" (UniqueName: 
\"kubernetes.io/projected/fb313a7b-10f0-44e0-b947-f9bc23011537-kube-api-access-mtxpn\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.595630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-fernet-keys\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.595660 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-config-data\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.697529 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-credential-keys\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.697710 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxpn\" (UniqueName: \"kubernetes.io/projected/fb313a7b-10f0-44e0-b947-f9bc23011537-kube-api-access-mtxpn\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.697810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-fernet-keys\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.697863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-config-data\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.697960 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-scripts\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.703750 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-credential-keys\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.704380 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-config-data\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.704653 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-scripts\") pod 
\"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.711560 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-fernet-keys\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.717077 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxpn\" (UniqueName: \"kubernetes.io/projected/fb313a7b-10f0-44e0-b947-f9bc23011537-kube-api-access-mtxpn\") pod \"keystone-5ff69df7c6-7pqcv\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:54 crc kubenswrapper[4747]: I1128 13:35:54.809061 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:35:55 crc kubenswrapper[4747]: I1128 13:35:55.313037 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv"] Nov 28 13:35:55 crc kubenswrapper[4747]: I1128 13:35:55.405134 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" event={"ID":"fb313a7b-10f0-44e0-b947-f9bc23011537","Type":"ContainerStarted","Data":"27f9d7ad179fc8b1a25e6a37e4175a3239cd9cc4c8722bfa44bb0bb11f36fc80"} Nov 28 13:35:56 crc kubenswrapper[4747]: I1128 13:35:56.415263 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" event={"ID":"fb313a7b-10f0-44e0-b947-f9bc23011537","Type":"ContainerStarted","Data":"5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded"} Nov 28 13:35:56 crc kubenswrapper[4747]: I1128 13:35:56.415656 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:36:26 crc kubenswrapper[4747]: I1128 13:36:26.188491 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:36:26 crc kubenswrapper[4747]: I1128 13:36:26.210792 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" podStartSLOduration=32.210771975 podStartE2EDuration="32.210771975s" podCreationTimestamp="2025-11-28 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:35:56.440415901 +0000 UTC m=+1009.102897691" watchObservedRunningTime="2025-11-28 13:36:26.210771975 +0000 UTC m=+1038.873253705" Nov 28 13:36:26 crc kubenswrapper[4747]: E1128 13:36:26.895853 4747 log.go:32] "Failed when writing line to log file" err="http2: stream closed" 
path="/var/log/pods/keystone-kuttl-tests_keystone-5ff69df7c6-7pqcv_fb313a7b-10f0-44e0-b947-f9bc23011537/keystone-api/0.log" line={} Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.265499 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-844f8fd76b-mwp8x"] Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.267476 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.276318 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-844f8fd76b-mwp8x"] Nov 28 13:36:27 crc kubenswrapper[4747]: E1128 13:36:27.349262 4747 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-5ff69df7c6-7pqcv_fb313a7b-10f0-44e0-b947-f9bc23011537/keystone-api/0.log" line={} Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.378081 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.378195 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.378263 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.378334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.378366 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-256tr\" (UniqueName: \"kubernetes.io/projected/26279628-22ee-48c6-a0c9-873cdb6e9cf1-kube-api-access-256tr\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.479773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.479858 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.479885 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.479931 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.479959 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-256tr\" (UniqueName: \"kubernetes.io/projected/26279628-22ee-48c6-a0c9-873cdb6e9cf1-kube-api-access-256tr\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.485916 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.486415 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.486486 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.486986 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.501614 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-256tr\" (UniqueName: \"kubernetes.io/projected/26279628-22ee-48c6-a0c9-873cdb6e9cf1-kube-api-access-256tr\") pod \"keystone-844f8fd76b-mwp8x\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:27 crc kubenswrapper[4747]: I1128 13:36:27.591111 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.068432 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-844f8fd76b-mwp8x"] Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.710780 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" event={"ID":"26279628-22ee-48c6-a0c9-873cdb6e9cf1","Type":"ContainerStarted","Data":"1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe"} Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.711152 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" event={"ID":"26279628-22ee-48c6-a0c9-873cdb6e9cf1","Type":"ContainerStarted","Data":"2e8252742143738547a161146eb9b8c94e69cb2b6167ade22b1f7b625c0f6806"} Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.711181 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.767405 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" podStartSLOduration=1.767375736 podStartE2EDuration="1.767375736s" podCreationTimestamp="2025-11-28 13:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:36:28.758810639 +0000 UTC m=+1041.421292409" watchObservedRunningTime="2025-11-28 13:36:28.767375736 +0000 UTC m=+1041.429857476" Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.792791 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6q82p"] Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.824513 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-77z47"] 
Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.842164 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6q82p"] Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.849164 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-77z47"] Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.856609 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv"] Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.856839 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" podUID="fb313a7b-10f0-44e0-b947-f9bc23011537" containerName="keystone-api" containerID="cri-o://5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded" gracePeriod=30 Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.862667 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone5daa-account-delete-7pjkk"] Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.864491 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.868606 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-844f8fd76b-mwp8x"] Nov 28 13:36:28 crc kubenswrapper[4747]: I1128 13:36:28.872608 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5daa-account-delete-7pjkk"] Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.920005 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.920103 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:29.420054183 +0000 UTC m=+1042.082535913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone-scripts" not found Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.920377 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.920406 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:29.420398392 +0000 UTC m=+1042.082880122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone" not found Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.921029 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.921130 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:29.42111049 +0000 UTC m=+1042.083592220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone-config-data" not found Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.921198 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:36:28 crc kubenswrapper[4747]: E1128 13:36:28.921251 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:29.421242533 +0000 UTC m=+1042.083724263 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone" not found Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.021413 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnqwk\" (UniqueName: \"kubernetes.io/projected/b57511e3-f0bd-4629-b210-abed85a19b84-kube-api-access-xnqwk\") pod \"keystone5daa-account-delete-7pjkk\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.021495 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57511e3-f0bd-4629-b210-abed85a19b84-operator-scripts\") pod \"keystone5daa-account-delete-7pjkk\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.123241 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnqwk\" (UniqueName: \"kubernetes.io/projected/b57511e3-f0bd-4629-b210-abed85a19b84-kube-api-access-xnqwk\") pod \"keystone5daa-account-delete-7pjkk\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.123306 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57511e3-f0bd-4629-b210-abed85a19b84-operator-scripts\") pod \"keystone5daa-account-delete-7pjkk\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:29 crc 
kubenswrapper[4747]: I1128 13:36:29.124315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57511e3-f0bd-4629-b210-abed85a19b84-operator-scripts\") pod \"keystone5daa-account-delete-7pjkk\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.145837 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnqwk\" (UniqueName: \"kubernetes.io/projected/b57511e3-f0bd-4629-b210-abed85a19b84-kube-api-access-xnqwk\") pod \"keystone5daa-account-delete-7pjkk\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.182894 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.391567 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5daa-account-delete-7pjkk"] Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429399 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429455 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429467 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:30.429448994 +0000 UTC m=+1043.091930714 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone" not found Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429543 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:30.429524196 +0000 UTC m=+1043.092005926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone-scripts" not found Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429547 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429586 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429635 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:30.429608458 +0000 UTC m=+1043.092090228 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone" not found Nov 28 13:36:29 crc kubenswrapper[4747]: E1128 13:36:29.429670 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:30.429653249 +0000 UTC m=+1043.092135119 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone-config-data" not found Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.654382 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e85120-b1df-47d7-854b-2898eb3b78fb" path="/var/lib/kubelet/pods/37e85120-b1df-47d7-854b-2898eb3b78fb/volumes" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.655419 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3a5419-e275-4c88-96e2-712ada1896b9" path="/var/lib/kubelet/pods/6c3a5419-e275-4c88-96e2-712ada1896b9/volumes" Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.720563 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" event={"ID":"b57511e3-f0bd-4629-b210-abed85a19b84","Type":"ContainerStarted","Data":"bb73d7f701f97da0d1fbc5805f5de66d8e72d185572d04c974d46a495918d76b"} Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.720607 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" 
event={"ID":"b57511e3-f0bd-4629-b210-abed85a19b84","Type":"ContainerStarted","Data":"e27904f284d816902519333c180ef22480e22e9379e66b4e2479537300e71879"} Nov 28 13:36:29 crc kubenswrapper[4747]: I1128 13:36:29.720844 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" secret="" err="secret \"keystone-keystone-dockercfg-rtwxn\" not found" Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443271 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443337 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443347 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:32.443332572 +0000 UTC m=+1045.105814302 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone" not found Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443373 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443283 4747 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443390 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:32.443378903 +0000 UTC m=+1045.105860653 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone-config-data" not found Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443543 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:32.443509646 +0000 UTC m=+1045.105991406 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone" not found Nov 28 13:36:30 crc kubenswrapper[4747]: E1128 13:36:30.443572 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts podName:26279628-22ee-48c6-a0c9-873cdb6e9cf1 nodeName:}" failed. No retries permitted until 2025-11-28 13:36:32.443555818 +0000 UTC m=+1045.106037658 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts") pod "keystone-844f8fd76b-mwp8x" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1") : secret "keystone-scripts" not found Nov 28 13:36:30 crc kubenswrapper[4747]: I1128 13:36:30.731802 4747 generic.go:334] "Generic (PLEG): container finished" podID="b57511e3-f0bd-4629-b210-abed85a19b84" containerID="bb73d7f701f97da0d1fbc5805f5de66d8e72d185572d04c974d46a495918d76b" exitCode=0 Nov 28 13:36:30 crc kubenswrapper[4747]: I1128 13:36:30.731879 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" event={"ID":"b57511e3-f0bd-4629-b210-abed85a19b84","Type":"ContainerDied","Data":"bb73d7f701f97da0d1fbc5805f5de66d8e72d185572d04c974d46a495918d76b"} Nov 28 13:36:30 crc kubenswrapper[4747]: I1128 13:36:30.732272 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" podUID="26279628-22ee-48c6-a0c9-873cdb6e9cf1" containerName="keystone-api" containerID="cri-o://1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe" gracePeriod=30 Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.051911 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.222844 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.252772 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnqwk\" (UniqueName: \"kubernetes.io/projected/b57511e3-f0bd-4629-b210-abed85a19b84-kube-api-access-xnqwk\") pod \"b57511e3-f0bd-4629-b210-abed85a19b84\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.252951 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57511e3-f0bd-4629-b210-abed85a19b84-operator-scripts\") pod \"b57511e3-f0bd-4629-b210-abed85a19b84\" (UID: \"b57511e3-f0bd-4629-b210-abed85a19b84\") " Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.254285 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b57511e3-f0bd-4629-b210-abed85a19b84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b57511e3-f0bd-4629-b210-abed85a19b84" (UID: "b57511e3-f0bd-4629-b210-abed85a19b84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.261923 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57511e3-f0bd-4629-b210-abed85a19b84-kube-api-access-xnqwk" (OuterVolumeSpecName: "kube-api-access-xnqwk") pod "b57511e3-f0bd-4629-b210-abed85a19b84" (UID: "b57511e3-f0bd-4629-b210-abed85a19b84"). InnerVolumeSpecName "kube-api-access-xnqwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.354802 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-256tr\" (UniqueName: \"kubernetes.io/projected/26279628-22ee-48c6-a0c9-873cdb6e9cf1-kube-api-access-256tr\") pod \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.354893 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys\") pod \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.354950 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys\") pod \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.355637 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data\") pod \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.355713 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts\") pod \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\" (UID: \"26279628-22ee-48c6-a0c9-873cdb6e9cf1\") " Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.356406 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b57511e3-f0bd-4629-b210-abed85a19b84-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.356548 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnqwk\" (UniqueName: \"kubernetes.io/projected/b57511e3-f0bd-4629-b210-abed85a19b84-kube-api-access-xnqwk\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.358064 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26279628-22ee-48c6-a0c9-873cdb6e9cf1-kube-api-access-256tr" (OuterVolumeSpecName: "kube-api-access-256tr") pod "26279628-22ee-48c6-a0c9-873cdb6e9cf1" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1"). InnerVolumeSpecName "kube-api-access-256tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.358534 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "26279628-22ee-48c6-a0c9-873cdb6e9cf1" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.359392 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts" (OuterVolumeSpecName: "scripts") pod "26279628-22ee-48c6-a0c9-873cdb6e9cf1" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.360881 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "26279628-22ee-48c6-a0c9-873cdb6e9cf1" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.389713 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data" (OuterVolumeSpecName: "config-data") pod "26279628-22ee-48c6-a0c9-873cdb6e9cf1" (UID: "26279628-22ee-48c6-a0c9-873cdb6e9cf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.458479 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.458534 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.458547 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-256tr\" (UniqueName: \"kubernetes.io/projected/26279628-22ee-48c6-a0c9-873cdb6e9cf1-kube-api-access-256tr\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.458559 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:31 crc 
kubenswrapper[4747]: I1128 13:36:31.458568 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26279628-22ee-48c6-a0c9-873cdb6e9cf1-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.748051 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.747961 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5daa-account-delete-7pjkk" event={"ID":"b57511e3-f0bd-4629-b210-abed85a19b84","Type":"ContainerDied","Data":"e27904f284d816902519333c180ef22480e22e9379e66b4e2479537300e71879"} Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.748605 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e27904f284d816902519333c180ef22480e22e9379e66b4e2479537300e71879" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.750331 4747 generic.go:334] "Generic (PLEG): container finished" podID="26279628-22ee-48c6-a0c9-873cdb6e9cf1" containerID="1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe" exitCode=0 Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.750415 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" event={"ID":"26279628-22ee-48c6-a0c9-873cdb6e9cf1","Type":"ContainerDied","Data":"1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe"} Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.750465 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" event={"ID":"26279628-22ee-48c6-a0c9-873cdb6e9cf1","Type":"ContainerDied","Data":"2e8252742143738547a161146eb9b8c94e69cb2b6167ade22b1f7b625c0f6806"} Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.750495 4747 scope.go:117] "RemoveContainer" 
containerID="1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.750533 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-844f8fd76b-mwp8x" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.779440 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-844f8fd76b-mwp8x"] Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.783241 4747 scope.go:117] "RemoveContainer" containerID="1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe" Nov 28 13:36:31 crc kubenswrapper[4747]: E1128 13:36:31.783599 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe\": container with ID starting with 1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe not found: ID does not exist" containerID="1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.783634 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe"} err="failed to get container status \"1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe\": rpc error: code = NotFound desc = could not find container \"1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe\": container with ID starting with 1ad251ce592bd0074fda82e3c6da5472c5cdc614d65a60626d7c8433e04b71fe not found: ID does not exist" Nov 28 13:36:31 crc kubenswrapper[4747]: I1128 13:36:31.784075 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-844f8fd76b-mwp8x"] Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.381556 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.474720 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-scripts\") pod \"fb313a7b-10f0-44e0-b947-f9bc23011537\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.474780 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-fernet-keys\") pod \"fb313a7b-10f0-44e0-b947-f9bc23011537\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.474853 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-credential-keys\") pod \"fb313a7b-10f0-44e0-b947-f9bc23011537\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.474905 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxpn\" (UniqueName: \"kubernetes.io/projected/fb313a7b-10f0-44e0-b947-f9bc23011537-kube-api-access-mtxpn\") pod \"fb313a7b-10f0-44e0-b947-f9bc23011537\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.474930 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-config-data\") pod \"fb313a7b-10f0-44e0-b947-f9bc23011537\" (UID: \"fb313a7b-10f0-44e0-b947-f9bc23011537\") " Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.479658 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fb313a7b-10f0-44e0-b947-f9bc23011537" (UID: "fb313a7b-10f0-44e0-b947-f9bc23011537"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.479710 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fb313a7b-10f0-44e0-b947-f9bc23011537" (UID: "fb313a7b-10f0-44e0-b947-f9bc23011537"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.480328 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb313a7b-10f0-44e0-b947-f9bc23011537-kube-api-access-mtxpn" (OuterVolumeSpecName: "kube-api-access-mtxpn") pod "fb313a7b-10f0-44e0-b947-f9bc23011537" (UID: "fb313a7b-10f0-44e0-b947-f9bc23011537"). InnerVolumeSpecName "kube-api-access-mtxpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.480409 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-scripts" (OuterVolumeSpecName: "scripts") pod "fb313a7b-10f0-44e0-b947-f9bc23011537" (UID: "fb313a7b-10f0-44e0-b947-f9bc23011537"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.496071 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-config-data" (OuterVolumeSpecName: "config-data") pod "fb313a7b-10f0-44e0-b947-f9bc23011537" (UID: "fb313a7b-10f0-44e0-b947-f9bc23011537"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.576471 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.576522 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.576546 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtxpn\" (UniqueName: \"kubernetes.io/projected/fb313a7b-10f0-44e0-b947-f9bc23011537-kube-api-access-mtxpn\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.576566 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.576582 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb313a7b-10f0-44e0-b947-f9bc23011537-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.759991 4747 generic.go:334] "Generic (PLEG): container finished" podID="fb313a7b-10f0-44e0-b947-f9bc23011537" containerID="5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded" exitCode=0 Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.760059 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" event={"ID":"fb313a7b-10f0-44e0-b947-f9bc23011537","Type":"ContainerDied","Data":"5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded"} Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 
13:36:32.760103 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.760124 4747 scope.go:117] "RemoveContainer" containerID="5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.760107 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv" event={"ID":"fb313a7b-10f0-44e0-b947-f9bc23011537","Type":"ContainerDied","Data":"27f9d7ad179fc8b1a25e6a37e4175a3239cd9cc4c8722bfa44bb0bb11f36fc80"} Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.788537 4747 scope.go:117] "RemoveContainer" containerID="5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded" Nov 28 13:36:32 crc kubenswrapper[4747]: E1128 13:36:32.789021 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded\": container with ID starting with 5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded not found: ID does not exist" containerID="5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.789055 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded"} err="failed to get container status \"5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded\": rpc error: code = NotFound desc = could not find container \"5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded\": container with ID starting with 5ed9f239a8a9ed536aa61d180d51b6f1d2b2ef39d4f3512b68b8b9431f027ded not found: ID does not exist" Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.801611 4747 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv"] Nov 28 13:36:32 crc kubenswrapper[4747]: I1128 13:36:32.806807 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5ff69df7c6-7pqcv"] Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.656764 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26279628-22ee-48c6-a0c9-873cdb6e9cf1" path="/var/lib/kubelet/pods/26279628-22ee-48c6-a0c9-873cdb6e9cf1/volumes" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.658438 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb313a7b-10f0-44e0-b947-f9bc23011537" path="/var/lib/kubelet/pods/fb313a7b-10f0-44e0-b947-f9bc23011537/volumes" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.872830 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-lz8sc"] Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.880717 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-lz8sc"] Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.904316 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone5daa-account-delete-7pjkk"] Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.909895 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn"] Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.914986 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone5daa-account-delete-7pjkk"] Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.920102 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5daa-account-create-update-g2qvn"] Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.965751 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cj8c7"] Nov 28 
13:36:33 crc kubenswrapper[4747]: E1128 13:36:33.966026 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26279628-22ee-48c6-a0c9-873cdb6e9cf1" containerName="keystone-api" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.966041 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="26279628-22ee-48c6-a0c9-873cdb6e9cf1" containerName="keystone-api" Nov 28 13:36:33 crc kubenswrapper[4747]: E1128 13:36:33.966069 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb313a7b-10f0-44e0-b947-f9bc23011537" containerName="keystone-api" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.966080 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb313a7b-10f0-44e0-b947-f9bc23011537" containerName="keystone-api" Nov 28 13:36:33 crc kubenswrapper[4747]: E1128 13:36:33.966092 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57511e3-f0bd-4629-b210-abed85a19b84" containerName="mariadb-account-delete" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.966101 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57511e3-f0bd-4629-b210-abed85a19b84" containerName="mariadb-account-delete" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.966278 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="26279628-22ee-48c6-a0c9-873cdb6e9cf1" containerName="keystone-api" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.966292 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57511e3-f0bd-4629-b210-abed85a19b84" containerName="mariadb-account-delete" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.966306 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb313a7b-10f0-44e0-b947-f9bc23011537" containerName="keystone-api" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.966788 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:33 crc kubenswrapper[4747]: I1128 13:36:33.973148 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cj8c7"] Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.069122 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9"] Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.070077 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.071981 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.079400 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9"] Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.095370 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc24e552-b90d-48d9-847c-b448241ce8b1-operator-scripts\") pod \"keystone-db-create-cj8c7\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.095433 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvppv\" (UniqueName: \"kubernetes.io/projected/bc24e552-b90d-48d9-847c-b448241ce8b1-kube-api-access-cvppv\") pod \"keystone-db-create-cj8c7\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.196704 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc24e552-b90d-48d9-847c-b448241ce8b1-operator-scripts\") pod \"keystone-db-create-cj8c7\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.196840 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvppv\" (UniqueName: \"kubernetes.io/projected/bc24e552-b90d-48d9-847c-b448241ce8b1-kube-api-access-cvppv\") pod \"keystone-db-create-cj8c7\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.196909 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kdx5\" (UniqueName: \"kubernetes.io/projected/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-kube-api-access-4kdx5\") pod \"keystone-4baa-account-create-update-g9rx9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.196991 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-operator-scripts\") pod \"keystone-4baa-account-create-update-g9rx9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.198188 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc24e552-b90d-48d9-847c-b448241ce8b1-operator-scripts\") pod \"keystone-db-create-cj8c7\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:34 crc kubenswrapper[4747]: 
I1128 13:36:34.213641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvppv\" (UniqueName: \"kubernetes.io/projected/bc24e552-b90d-48d9-847c-b448241ce8b1-kube-api-access-cvppv\") pod \"keystone-db-create-cj8c7\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.285062 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.298136 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-operator-scripts\") pod \"keystone-4baa-account-create-update-g9rx9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.298326 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kdx5\" (UniqueName: \"kubernetes.io/projected/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-kube-api-access-4kdx5\") pod \"keystone-4baa-account-create-update-g9rx9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.298916 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-operator-scripts\") pod \"keystone-4baa-account-create-update-g9rx9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.323151 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4kdx5\" (UniqueName: \"kubernetes.io/projected/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-kube-api-access-4kdx5\") pod \"keystone-4baa-account-create-update-g9rx9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.384813 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.741289 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cj8c7"] Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.789416 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-cj8c7" event={"ID":"bc24e552-b90d-48d9-847c-b448241ce8b1","Type":"ContainerStarted","Data":"6661937ca8c6395e0870742a27c8a17ed15325c948680714c88623b695995763"} Nov 28 13:36:34 crc kubenswrapper[4747]: I1128 13:36:34.797752 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9"] Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.651404 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e" path="/var/lib/kubelet/pods/9d5d50a3-d64c-4e47-afe0-4e1a3b8f8b0e/volumes" Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.652096 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f9a105-84d9-4be5-b9af-b5c2cfd5761e" path="/var/lib/kubelet/pods/a1f9a105-84d9-4be5-b9af-b5c2cfd5761e/volumes" Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.652571 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57511e3-f0bd-4629-b210-abed85a19b84" path="/var/lib/kubelet/pods/b57511e3-f0bd-4629-b210-abed85a19b84/volumes" Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.815168 
4747 generic.go:334] "Generic (PLEG): container finished" podID="bc24e552-b90d-48d9-847c-b448241ce8b1" containerID="4c182fc717424510b860acc9ed3e1ca53439e5dec7fc1bd0c32102c4743a7335" exitCode=0 Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.815278 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-cj8c7" event={"ID":"bc24e552-b90d-48d9-847c-b448241ce8b1","Type":"ContainerDied","Data":"4c182fc717424510b860acc9ed3e1ca53439e5dec7fc1bd0c32102c4743a7335"} Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.820757 4747 generic.go:334] "Generic (PLEG): container finished" podID="51c44925-3a02-4b6e-a4c7-1c3911a05ed9" containerID="143ec3f906d36e2da5a9e2f3ff369af37bc1c2858f0fca4ff478c41dfe1e6555" exitCode=0 Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.820876 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" event={"ID":"51c44925-3a02-4b6e-a4c7-1c3911a05ed9","Type":"ContainerDied","Data":"143ec3f906d36e2da5a9e2f3ff369af37bc1c2858f0fca4ff478c41dfe1e6555"} Nov 28 13:36:35 crc kubenswrapper[4747]: I1128 13:36:35.820928 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" event={"ID":"51c44925-3a02-4b6e-a4c7-1c3911a05ed9","Type":"ContainerStarted","Data":"e9bbf292001d5de41360cf8353bce40edb76802b054d2f80505a40e88e9ca66d"} Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.158117 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.162399 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.342848 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-operator-scripts\") pod \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.342925 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc24e552-b90d-48d9-847c-b448241ce8b1-operator-scripts\") pod \"bc24e552-b90d-48d9-847c-b448241ce8b1\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.342958 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvppv\" (UniqueName: \"kubernetes.io/projected/bc24e552-b90d-48d9-847c-b448241ce8b1-kube-api-access-cvppv\") pod \"bc24e552-b90d-48d9-847c-b448241ce8b1\" (UID: \"bc24e552-b90d-48d9-847c-b448241ce8b1\") " Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.342987 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kdx5\" (UniqueName: \"kubernetes.io/projected/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-kube-api-access-4kdx5\") pod \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\" (UID: \"51c44925-3a02-4b6e-a4c7-1c3911a05ed9\") " Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.343788 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51c44925-3a02-4b6e-a4c7-1c3911a05ed9" (UID: "51c44925-3a02-4b6e-a4c7-1c3911a05ed9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.343921 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc24e552-b90d-48d9-847c-b448241ce8b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc24e552-b90d-48d9-847c-b448241ce8b1" (UID: "bc24e552-b90d-48d9-847c-b448241ce8b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.344390 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.344425 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc24e552-b90d-48d9-847c-b448241ce8b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.348953 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc24e552-b90d-48d9-847c-b448241ce8b1-kube-api-access-cvppv" (OuterVolumeSpecName: "kube-api-access-cvppv") pod "bc24e552-b90d-48d9-847c-b448241ce8b1" (UID: "bc24e552-b90d-48d9-847c-b448241ce8b1"). InnerVolumeSpecName "kube-api-access-cvppv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.349154 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-kube-api-access-4kdx5" (OuterVolumeSpecName: "kube-api-access-4kdx5") pod "51c44925-3a02-4b6e-a4c7-1c3911a05ed9" (UID: "51c44925-3a02-4b6e-a4c7-1c3911a05ed9"). InnerVolumeSpecName "kube-api-access-4kdx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.445555 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvppv\" (UniqueName: \"kubernetes.io/projected/bc24e552-b90d-48d9-847c-b448241ce8b1-kube-api-access-cvppv\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.445619 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kdx5\" (UniqueName: \"kubernetes.io/projected/51c44925-3a02-4b6e-a4c7-1c3911a05ed9-kube-api-access-4kdx5\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.839175 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" event={"ID":"51c44925-3a02-4b6e-a4c7-1c3911a05ed9","Type":"ContainerDied","Data":"e9bbf292001d5de41360cf8353bce40edb76802b054d2f80505a40e88e9ca66d"} Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.839248 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9bbf292001d5de41360cf8353bce40edb76802b054d2f80505a40e88e9ca66d" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.839200 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.840668 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-cj8c7" event={"ID":"bc24e552-b90d-48d9-847c-b448241ce8b1","Type":"ContainerDied","Data":"6661937ca8c6395e0870742a27c8a17ed15325c948680714c88623b695995763"} Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.840707 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cj8c7" Nov 28 13:36:37 crc kubenswrapper[4747]: I1128 13:36:37.840711 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6661937ca8c6395e0870742a27c8a17ed15325c948680714c88623b695995763" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.677825 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rjqxg"] Nov 28 13:36:39 crc kubenswrapper[4747]: E1128 13:36:39.678835 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c44925-3a02-4b6e-a4c7-1c3911a05ed9" containerName="mariadb-account-create-update" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.678858 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c44925-3a02-4b6e-a4c7-1c3911a05ed9" containerName="mariadb-account-create-update" Nov 28 13:36:39 crc kubenswrapper[4747]: E1128 13:36:39.678901 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc24e552-b90d-48d9-847c-b448241ce8b1" containerName="mariadb-database-create" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.678913 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc24e552-b90d-48d9-847c-b448241ce8b1" containerName="mariadb-database-create" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.679195 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc24e552-b90d-48d9-847c-b448241ce8b1" containerName="mariadb-database-create" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.679239 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c44925-3a02-4b6e-a4c7-1c3911a05ed9" containerName="mariadb-account-create-update" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.680442 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.684479 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.684602 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-hjrcj" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.684679 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.691412 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rjqxg"] Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.697269 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.877692 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdmn\" (UniqueName: \"kubernetes.io/projected/55e65fc1-ca46-46b3-b83f-5e66982738bc-kube-api-access-xkdmn\") pod \"keystone-db-sync-rjqxg\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.877921 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e65fc1-ca46-46b3-b83f-5e66982738bc-config-data\") pod \"keystone-db-sync-rjqxg\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.979186 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdmn\" (UniqueName: 
\"kubernetes.io/projected/55e65fc1-ca46-46b3-b83f-5e66982738bc-kube-api-access-xkdmn\") pod \"keystone-db-sync-rjqxg\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.979298 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e65fc1-ca46-46b3-b83f-5e66982738bc-config-data\") pod \"keystone-db-sync-rjqxg\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.988269 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e65fc1-ca46-46b3-b83f-5e66982738bc-config-data\") pod \"keystone-db-sync-rjqxg\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:39 crc kubenswrapper[4747]: I1128 13:36:39.996048 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdmn\" (UniqueName: \"kubernetes.io/projected/55e65fc1-ca46-46b3-b83f-5e66982738bc-kube-api-access-xkdmn\") pod \"keystone-db-sync-rjqxg\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:40 crc kubenswrapper[4747]: I1128 13:36:40.005932 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:40 crc kubenswrapper[4747]: I1128 13:36:40.291041 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rjqxg"] Nov 28 13:36:40 crc kubenswrapper[4747]: I1128 13:36:40.862812 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" event={"ID":"55e65fc1-ca46-46b3-b83f-5e66982738bc","Type":"ContainerStarted","Data":"97625c7c601ccb0cf10013e0335070d5fcaeffa5eb21404929c581232c42eb22"} Nov 28 13:36:40 crc kubenswrapper[4747]: I1128 13:36:40.863178 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" event={"ID":"55e65fc1-ca46-46b3-b83f-5e66982738bc","Type":"ContainerStarted","Data":"38270bde3c9cdb268e3e645d618551e3d63949ea95998308ea30a8f423318de6"} Nov 28 13:36:40 crc kubenswrapper[4747]: I1128 13:36:40.889239 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" podStartSLOduration=1.88918619 podStartE2EDuration="1.88918619s" podCreationTimestamp="2025-11-28 13:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:36:40.887409075 +0000 UTC m=+1053.549890845" watchObservedRunningTime="2025-11-28 13:36:40.88918619 +0000 UTC m=+1053.551667930" Nov 28 13:36:42 crc kubenswrapper[4747]: I1128 13:36:42.877600 4747 generic.go:334] "Generic (PLEG): container finished" podID="55e65fc1-ca46-46b3-b83f-5e66982738bc" containerID="97625c7c601ccb0cf10013e0335070d5fcaeffa5eb21404929c581232c42eb22" exitCode=0 Nov 28 13:36:42 crc kubenswrapper[4747]: I1128 13:36:42.877688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" 
event={"ID":"55e65fc1-ca46-46b3-b83f-5e66982738bc","Type":"ContainerDied","Data":"97625c7c601ccb0cf10013e0335070d5fcaeffa5eb21404929c581232c42eb22"} Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.215264 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.253551 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkdmn\" (UniqueName: \"kubernetes.io/projected/55e65fc1-ca46-46b3-b83f-5e66982738bc-kube-api-access-xkdmn\") pod \"55e65fc1-ca46-46b3-b83f-5e66982738bc\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.253959 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e65fc1-ca46-46b3-b83f-5e66982738bc-config-data\") pod \"55e65fc1-ca46-46b3-b83f-5e66982738bc\" (UID: \"55e65fc1-ca46-46b3-b83f-5e66982738bc\") " Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.261016 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e65fc1-ca46-46b3-b83f-5e66982738bc-kube-api-access-xkdmn" (OuterVolumeSpecName: "kube-api-access-xkdmn") pod "55e65fc1-ca46-46b3-b83f-5e66982738bc" (UID: "55e65fc1-ca46-46b3-b83f-5e66982738bc"). InnerVolumeSpecName "kube-api-access-xkdmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.294848 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e65fc1-ca46-46b3-b83f-5e66982738bc-config-data" (OuterVolumeSpecName: "config-data") pod "55e65fc1-ca46-46b3-b83f-5e66982738bc" (UID: "55e65fc1-ca46-46b3-b83f-5e66982738bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.355355 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e65fc1-ca46-46b3-b83f-5e66982738bc-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.355415 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkdmn\" (UniqueName: \"kubernetes.io/projected/55e65fc1-ca46-46b3-b83f-5e66982738bc-kube-api-access-xkdmn\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.893701 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" event={"ID":"55e65fc1-ca46-46b3-b83f-5e66982738bc","Type":"ContainerDied","Data":"38270bde3c9cdb268e3e645d618551e3d63949ea95998308ea30a8f423318de6"} Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.893747 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rjqxg" Nov 28 13:36:44 crc kubenswrapper[4747]: I1128 13:36:44.893759 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38270bde3c9cdb268e3e645d618551e3d63949ea95998308ea30a8f423318de6" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.081655 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tgmws"] Nov 28 13:36:45 crc kubenswrapper[4747]: E1128 13:36:45.081895 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e65fc1-ca46-46b3-b83f-5e66982738bc" containerName="keystone-db-sync" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.081905 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e65fc1-ca46-46b3-b83f-5e66982738bc" containerName="keystone-db-sync" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.082027 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e65fc1-ca46-46b3-b83f-5e66982738bc" containerName="keystone-db-sync" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.082523 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.088249 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.088244 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.088529 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-hjrcj" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.088632 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.088706 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.096420 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tgmws"] Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.169559 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-fernet-keys\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.169666 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-scripts\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.169804 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qjm\" (UniqueName: \"kubernetes.io/projected/00d85152-934b-47f1-9413-f0e589baac15-kube-api-access-m5qjm\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.169882 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-config-data\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.169951 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-credential-keys\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.272103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qjm\" (UniqueName: \"kubernetes.io/projected/00d85152-934b-47f1-9413-f0e589baac15-kube-api-access-m5qjm\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.272196 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-config-data\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: 
I1128 13:36:45.272249 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-credential-keys\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.272272 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-fernet-keys\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.272302 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-scripts\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.275945 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-scripts\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.278546 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-fernet-keys\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.278683 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-credential-keys\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.279951 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-config-data\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.292670 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qjm\" (UniqueName: \"kubernetes.io/projected/00d85152-934b-47f1-9413-f0e589baac15-kube-api-access-m5qjm\") pod \"keystone-bootstrap-tgmws\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.400251 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.851998 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tgmws"] Nov 28 13:36:45 crc kubenswrapper[4747]: I1128 13:36:45.900468 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" event={"ID":"00d85152-934b-47f1-9413-f0e589baac15","Type":"ContainerStarted","Data":"89713b4171f7c2d5b6845182631a94624cf9db3b12fb6b1a880e61a7b52f185e"} Nov 28 13:36:46 crc kubenswrapper[4747]: I1128 13:36:46.907952 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" event={"ID":"00d85152-934b-47f1-9413-f0e589baac15","Type":"ContainerStarted","Data":"4f6b4da1b30dc65e0c06a68e6fd79c7e645e2ea40b24fdcad0562b7aa5c88554"} Nov 28 13:36:46 crc kubenswrapper[4747]: I1128 13:36:46.930611 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" podStartSLOduration=1.9305904 podStartE2EDuration="1.9305904s" podCreationTimestamp="2025-11-28 13:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:36:46.929240446 +0000 UTC m=+1059.591722176" watchObservedRunningTime="2025-11-28 13:36:46.9305904 +0000 UTC m=+1059.593072130" Nov 28 13:36:47 crc kubenswrapper[4747]: I1128 13:36:47.633250 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:36:47 crc kubenswrapper[4747]: I1128 13:36:47.633326 4747 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:36:48 crc kubenswrapper[4747]: I1128 13:36:48.923822 4747 generic.go:334] "Generic (PLEG): container finished" podID="00d85152-934b-47f1-9413-f0e589baac15" containerID="4f6b4da1b30dc65e0c06a68e6fd79c7e645e2ea40b24fdcad0562b7aa5c88554" exitCode=0 Nov 28 13:36:48 crc kubenswrapper[4747]: I1128 13:36:48.923868 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" event={"ID":"00d85152-934b-47f1-9413-f0e589baac15","Type":"ContainerDied","Data":"4f6b4da1b30dc65e0c06a68e6fd79c7e645e2ea40b24fdcad0562b7aa5c88554"} Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.219901 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.343078 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-credential-keys\") pod \"00d85152-934b-47f1-9413-f0e589baac15\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.343164 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qjm\" (UniqueName: \"kubernetes.io/projected/00d85152-934b-47f1-9413-f0e589baac15-kube-api-access-m5qjm\") pod \"00d85152-934b-47f1-9413-f0e589baac15\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.343196 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-scripts\") pod \"00d85152-934b-47f1-9413-f0e589baac15\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.343223 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-config-data\") pod \"00d85152-934b-47f1-9413-f0e589baac15\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.343262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-fernet-keys\") pod \"00d85152-934b-47f1-9413-f0e589baac15\" (UID: \"00d85152-934b-47f1-9413-f0e589baac15\") " Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.349331 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-scripts" (OuterVolumeSpecName: "scripts") pod "00d85152-934b-47f1-9413-f0e589baac15" (UID: "00d85152-934b-47f1-9413-f0e589baac15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.349977 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "00d85152-934b-47f1-9413-f0e589baac15" (UID: "00d85152-934b-47f1-9413-f0e589baac15"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.353462 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "00d85152-934b-47f1-9413-f0e589baac15" (UID: "00d85152-934b-47f1-9413-f0e589baac15"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.358198 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d85152-934b-47f1-9413-f0e589baac15-kube-api-access-m5qjm" (OuterVolumeSpecName: "kube-api-access-m5qjm") pod "00d85152-934b-47f1-9413-f0e589baac15" (UID: "00d85152-934b-47f1-9413-f0e589baac15"). InnerVolumeSpecName "kube-api-access-m5qjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.374380 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-config-data" (OuterVolumeSpecName: "config-data") pod "00d85152-934b-47f1-9413-f0e589baac15" (UID: "00d85152-934b-47f1-9413-f0e589baac15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.444596 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.444629 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.444639 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.444647 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00d85152-934b-47f1-9413-f0e589baac15-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.444656 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qjm\" (UniqueName: \"kubernetes.io/projected/00d85152-934b-47f1-9413-f0e589baac15-kube-api-access-m5qjm\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.938032 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" event={"ID":"00d85152-934b-47f1-9413-f0e589baac15","Type":"ContainerDied","Data":"89713b4171f7c2d5b6845182631a94624cf9db3b12fb6b1a880e61a7b52f185e"} Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.938577 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89713b4171f7c2d5b6845182631a94624cf9db3b12fb6b1a880e61a7b52f185e" Nov 28 13:36:50 crc kubenswrapper[4747]: I1128 13:36:50.938309 4747 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tgmws" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.144706 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-kj49m"] Nov 28 13:36:51 crc kubenswrapper[4747]: E1128 13:36:51.145022 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d85152-934b-47f1-9413-f0e589baac15" containerName="keystone-bootstrap" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.145041 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d85152-934b-47f1-9413-f0e589baac15" containerName="keystone-bootstrap" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.145228 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d85152-934b-47f1-9413-f0e589baac15" containerName="keystone-bootstrap" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.145714 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.149550 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.150231 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.150829 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.150942 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-hjrcj" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.163872 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-kj49m"] Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 
13:36:51.255952 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-scripts\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.256037 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnh7p\" (UniqueName: \"kubernetes.io/projected/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-kube-api-access-jnh7p\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.256367 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-config-data\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.256447 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-fernet-keys\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.256561 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-credential-keys\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 
28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.357886 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-scripts\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.357959 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnh7p\" (UniqueName: \"kubernetes.io/projected/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-kube-api-access-jnh7p\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.358020 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-config-data\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.358050 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-fernet-keys\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.358094 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-credential-keys\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 
13:36:51.362029 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-scripts\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.362629 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-fernet-keys\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.362681 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-credential-keys\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.362964 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-config-data\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.376032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnh7p\" (UniqueName: \"kubernetes.io/projected/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-kube-api-access-jnh7p\") pod \"keystone-5465cd874c-kj49m\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.463824 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.905501 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-kj49m"] Nov 28 13:36:51 crc kubenswrapper[4747]: I1128 13:36:51.954575 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" event={"ID":"c59e50b1-bd44-4e42-bc34-5acefee2e0a0","Type":"ContainerStarted","Data":"a2bc21c0232ac4da4f92772263be6e1d5955da0473b1d11af5fbc588259df310"} Nov 28 13:36:52 crc kubenswrapper[4747]: I1128 13:36:52.964628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" event={"ID":"c59e50b1-bd44-4e42-bc34-5acefee2e0a0","Type":"ContainerStarted","Data":"2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693"} Nov 28 13:36:52 crc kubenswrapper[4747]: I1128 13:36:52.965343 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:36:52 crc kubenswrapper[4747]: I1128 13:36:52.988810 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" podStartSLOduration=1.988789843 podStartE2EDuration="1.988789843s" podCreationTimestamp="2025-11-28 13:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:36:52.985471599 +0000 UTC m=+1065.647953339" watchObservedRunningTime="2025-11-28 13:36:52.988789843 +0000 UTC m=+1065.651271583" Nov 28 13:37:17 crc kubenswrapper[4747]: I1128 13:37:17.632766 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 28 13:37:17 crc kubenswrapper[4747]: I1128 13:37:17.633651 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:37:22 crc kubenswrapper[4747]: I1128 13:37:22.874621 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.134846 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-xxdpm"] Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.136151 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.139744 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-27m2p"] Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.140525 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.160341 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-27m2p"] Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.203686 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-xxdpm"] Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299320 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-config-data\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299368 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx9g2\" (UniqueName: \"kubernetes.io/projected/b9fd4e9d-888f-42d4-85ee-af5d42776858-kube-api-access-wx9g2\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299409 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-scripts\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299436 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-fernet-keys\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " 
pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299520 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-credential-keys\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299611 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-config-data\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299645 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-scripts\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299829 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpdb\" (UniqueName: \"kubernetes.io/projected/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-kube-api-access-xnpdb\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.299888 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-credential-keys\") pod 
\"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.300034 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-fernet-keys\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.401993 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-fernet-keys\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-config-data\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402095 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx9g2\" (UniqueName: \"kubernetes.io/projected/b9fd4e9d-888f-42d4-85ee-af5d42776858-kube-api-access-wx9g2\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402126 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-scripts\") pod \"keystone-5465cd874c-27m2p\" (UID: 
\"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402152 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-fernet-keys\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402175 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-credential-keys\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402230 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-config-data\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402257 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-scripts\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402303 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpdb\" (UniqueName: \"kubernetes.io/projected/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-kube-api-access-xnpdb\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " 
pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.402321 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-credential-keys\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.407841 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-credential-keys\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.407879 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-scripts\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.408271 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-fernet-keys\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.408594 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-config-data\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: 
I1128 13:37:24.410531 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-scripts\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.410728 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-fernet-keys\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.411480 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-config-data\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.412332 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-credential-keys\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.434122 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx9g2\" (UniqueName: \"kubernetes.io/projected/b9fd4e9d-888f-42d4-85ee-af5d42776858-kube-api-access-wx9g2\") pod \"keystone-5465cd874c-27m2p\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.436523 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xnpdb\" (UniqueName: \"kubernetes.io/projected/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-kube-api-access-xnpdb\") pod \"keystone-5465cd874c-xxdpm\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.458660 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.461633 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.692867 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-xxdpm"] Nov 28 13:37:24 crc kubenswrapper[4747]: I1128 13:37:24.939842 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-27m2p"] Nov 28 13:37:25 crc kubenswrapper[4747]: I1128 13:37:25.250945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" event={"ID":"54a9d1f5-ad51-40b3-9f18-bc662301a1a1","Type":"ContainerStarted","Data":"be82f605f88295a1e88c83bb4547ca021755085017c7ede67d0ce65e9814a7dc"} Nov 28 13:37:25 crc kubenswrapper[4747]: I1128 13:37:25.253380 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" event={"ID":"b9fd4e9d-888f-42d4-85ee-af5d42776858","Type":"ContainerStarted","Data":"81043093d6f24cbe3fdb44b9d4ebca63776cf294ae9d148b33f6be25e656d06d"} Nov 28 13:37:26 crc kubenswrapper[4747]: I1128 13:37:26.263745 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" event={"ID":"b9fd4e9d-888f-42d4-85ee-af5d42776858","Type":"ContainerStarted","Data":"6ea01fe44796df136413e2efb5bdd86f1f897db80e61c6e40ea145f51e413fb0"} Nov 28 13:37:26 crc 
kubenswrapper[4747]: I1128 13:37:26.264136 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:26 crc kubenswrapper[4747]: I1128 13:37:26.267787 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" event={"ID":"54a9d1f5-ad51-40b3-9f18-bc662301a1a1","Type":"ContainerStarted","Data":"f880fa542499c4eed093cb3073c0ddfd46fea48042a6234b8e0edab0f4602f61"} Nov 28 13:37:26 crc kubenswrapper[4747]: I1128 13:37:26.268057 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:26 crc kubenswrapper[4747]: I1128 13:37:26.293908 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" podStartSLOduration=2.2938866620000002 podStartE2EDuration="2.293886662s" podCreationTimestamp="2025-11-28 13:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:37:26.287023928 +0000 UTC m=+1098.949505688" watchObservedRunningTime="2025-11-28 13:37:26.293886662 +0000 UTC m=+1098.956368402" Nov 28 13:37:26 crc kubenswrapper[4747]: I1128 13:37:26.312911 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" podStartSLOduration=2.312891944 podStartE2EDuration="2.312891944s" podCreationTimestamp="2025-11-28 13:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:37:26.307725483 +0000 UTC m=+1098.970207253" watchObservedRunningTime="2025-11-28 13:37:26.312891944 +0000 UTC m=+1098.975373684" Nov 28 13:37:47 crc kubenswrapper[4747]: I1128 13:37:47.632812 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:37:47 crc kubenswrapper[4747]: I1128 13:37:47.633510 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:37:47 crc kubenswrapper[4747]: I1128 13:37:47.633579 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:37:47 crc kubenswrapper[4747]: I1128 13:37:47.634519 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08e13cafa96480abebcf6277e7d8891630344ed15e24b7ed7d255d3a6f63b7d5"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:37:47 crc kubenswrapper[4747]: I1128 13:37:47.634621 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://08e13cafa96480abebcf6277e7d8891630344ed15e24b7ed7d255d3a6f63b7d5" gracePeriod=600 Nov 28 13:37:48 crc kubenswrapper[4747]: I1128 13:37:48.437025 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="08e13cafa96480abebcf6277e7d8891630344ed15e24b7ed7d255d3a6f63b7d5" exitCode=0 Nov 28 13:37:48 crc kubenswrapper[4747]: I1128 13:37:48.437096 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"08e13cafa96480abebcf6277e7d8891630344ed15e24b7ed7d255d3a6f63b7d5"} Nov 28 13:37:48 crc kubenswrapper[4747]: I1128 13:37:48.437605 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"b2771dccb85c0ecd4859ca56d594c33cf7a03691a61ea3867cc5df5fbf1dd95c"} Nov 28 13:37:48 crc kubenswrapper[4747]: I1128 13:37:48.437625 4747 scope.go:117] "RemoveContainer" containerID="b7faf1b409a382c4ed714300a1dd00c81a6791b386fd5f862cfc6c604d1093bb" Nov 28 13:37:56 crc kubenswrapper[4747]: I1128 13:37:56.259695 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:37:56 crc kubenswrapper[4747]: I1128 13:37:56.261818 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:37:57 crc kubenswrapper[4747]: I1128 13:37:57.135140 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-xxdpm"] Nov 28 13:37:57 crc kubenswrapper[4747]: I1128 13:37:57.135454 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" podUID="54a9d1f5-ad51-40b3-9f18-bc662301a1a1" containerName="keystone-api" containerID="cri-o://f880fa542499c4eed093cb3073c0ddfd46fea48042a6234b8e0edab0f4602f61" gracePeriod=30 Nov 28 13:37:57 crc kubenswrapper[4747]: I1128 13:37:57.139184 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-27m2p"] Nov 28 13:37:57 crc kubenswrapper[4747]: I1128 13:37:57.139416 4747 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" podUID="b9fd4e9d-888f-42d4-85ee-af5d42776858" containerName="keystone-api" containerID="cri-o://6ea01fe44796df136413e2efb5bdd86f1f897db80e61c6e40ea145f51e413fb0" gracePeriod=30 Nov 28 13:37:58 crc kubenswrapper[4747]: I1128 13:37:58.325051 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-kj49m"] Nov 28 13:37:58 crc kubenswrapper[4747]: I1128 13:37:58.325520 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" podUID="c59e50b1-bd44-4e42-bc34-5acefee2e0a0" containerName="keystone-api" containerID="cri-o://2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693" gracePeriod=30 Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.535591 4747 generic.go:334] "Generic (PLEG): container finished" podID="b9fd4e9d-888f-42d4-85ee-af5d42776858" containerID="6ea01fe44796df136413e2efb5bdd86f1f897db80e61c6e40ea145f51e413fb0" exitCode=0 Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.535688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" event={"ID":"b9fd4e9d-888f-42d4-85ee-af5d42776858","Type":"ContainerDied","Data":"6ea01fe44796df136413e2efb5bdd86f1f897db80e61c6e40ea145f51e413fb0"} Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.537712 4747 generic.go:334] "Generic (PLEG): container finished" podID="54a9d1f5-ad51-40b3-9f18-bc662301a1a1" containerID="f880fa542499c4eed093cb3073c0ddfd46fea48042a6234b8e0edab0f4602f61" exitCode=0 Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.537749 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" event={"ID":"54a9d1f5-ad51-40b3-9f18-bc662301a1a1","Type":"ContainerDied","Data":"f880fa542499c4eed093cb3073c0ddfd46fea48042a6234b8e0edab0f4602f61"} Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.612103 4747 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.617080 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.673943 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnpdb\" (UniqueName: \"kubernetes.io/projected/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-kube-api-access-xnpdb\") pod \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674068 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-config-data\") pod \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674125 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-credential-keys\") pod \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674157 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-scripts\") pod \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674355 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-credential-keys\") pod 
\"b9fd4e9d-888f-42d4-85ee-af5d42776858\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674529 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-config-data\") pod \"b9fd4e9d-888f-42d4-85ee-af5d42776858\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674624 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-scripts\") pod \"b9fd4e9d-888f-42d4-85ee-af5d42776858\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674646 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-fernet-keys\") pod \"b9fd4e9d-888f-42d4-85ee-af5d42776858\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674687 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx9g2\" (UniqueName: \"kubernetes.io/projected/b9fd4e9d-888f-42d4-85ee-af5d42776858-kube-api-access-wx9g2\") pod \"b9fd4e9d-888f-42d4-85ee-af5d42776858\" (UID: \"b9fd4e9d-888f-42d4-85ee-af5d42776858\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.674751 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-fernet-keys\") pod \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\" (UID: \"54a9d1f5-ad51-40b3-9f18-bc662301a1a1\") " Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.682848 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b9fd4e9d-888f-42d4-85ee-af5d42776858" (UID: "b9fd4e9d-888f-42d4-85ee-af5d42776858"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.682905 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "54a9d1f5-ad51-40b3-9f18-bc662301a1a1" (UID: "54a9d1f5-ad51-40b3-9f18-bc662301a1a1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.682972 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-scripts" (OuterVolumeSpecName: "scripts") pod "b9fd4e9d-888f-42d4-85ee-af5d42776858" (UID: "b9fd4e9d-888f-42d4-85ee-af5d42776858"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.683003 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b9fd4e9d-888f-42d4-85ee-af5d42776858" (UID: "b9fd4e9d-888f-42d4-85ee-af5d42776858"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.683026 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-scripts" (OuterVolumeSpecName: "scripts") pod "54a9d1f5-ad51-40b3-9f18-bc662301a1a1" (UID: "54a9d1f5-ad51-40b3-9f18-bc662301a1a1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.683824 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-kube-api-access-xnpdb" (OuterVolumeSpecName: "kube-api-access-xnpdb") pod "54a9d1f5-ad51-40b3-9f18-bc662301a1a1" (UID: "54a9d1f5-ad51-40b3-9f18-bc662301a1a1"). InnerVolumeSpecName "kube-api-access-xnpdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.685697 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "54a9d1f5-ad51-40b3-9f18-bc662301a1a1" (UID: "54a9d1f5-ad51-40b3-9f18-bc662301a1a1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.692314 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9fd4e9d-888f-42d4-85ee-af5d42776858-kube-api-access-wx9g2" (OuterVolumeSpecName: "kube-api-access-wx9g2") pod "b9fd4e9d-888f-42d4-85ee-af5d42776858" (UID: "b9fd4e9d-888f-42d4-85ee-af5d42776858"). InnerVolumeSpecName "kube-api-access-wx9g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.701510 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-config-data" (OuterVolumeSpecName: "config-data") pod "54a9d1f5-ad51-40b3-9f18-bc662301a1a1" (UID: "54a9d1f5-ad51-40b3-9f18-bc662301a1a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.701609 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-config-data" (OuterVolumeSpecName: "config-data") pod "b9fd4e9d-888f-42d4-85ee-af5d42776858" (UID: "b9fd4e9d-888f-42d4-85ee-af5d42776858"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779331 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779394 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779404 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779413 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779424 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779433 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/b9fd4e9d-888f-42d4-85ee-af5d42776858-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779441 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx9g2\" (UniqueName: \"kubernetes.io/projected/b9fd4e9d-888f-42d4-85ee-af5d42776858-kube-api-access-wx9g2\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779454 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779464 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnpdb\" (UniqueName: \"kubernetes.io/projected/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-kube-api-access-xnpdb\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:00 crc kubenswrapper[4747]: I1128 13:38:00.779477 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a9d1f5-ad51-40b3-9f18-bc662301a1a1-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.550865 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" event={"ID":"b9fd4e9d-888f-42d4-85ee-af5d42776858","Type":"ContainerDied","Data":"81043093d6f24cbe3fdb44b9d4ebca63776cf294ae9d148b33f6be25e656d06d"} Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.550984 4747 scope.go:117] "RemoveContainer" containerID="6ea01fe44796df136413e2efb5bdd86f1f897db80e61c6e40ea145f51e413fb0" Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.550902 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-27m2p" Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.554804 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" event={"ID":"54a9d1f5-ad51-40b3-9f18-bc662301a1a1","Type":"ContainerDied","Data":"be82f605f88295a1e88c83bb4547ca021755085017c7ede67d0ce65e9814a7dc"} Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.554877 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-xxdpm" Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.593969 4747 scope.go:117] "RemoveContainer" containerID="f880fa542499c4eed093cb3073c0ddfd46fea48042a6234b8e0edab0f4602f61" Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.607948 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-27m2p"] Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.619830 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-27m2p"] Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.650261 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9fd4e9d-888f-42d4-85ee-af5d42776858" path="/var/lib/kubelet/pods/b9fd4e9d-888f-42d4-85ee-af5d42776858/volumes" Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.650800 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-xxdpm"] Nov 28 13:38:01 crc kubenswrapper[4747]: I1128 13:38:01.654237 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-xxdpm"] Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.459919 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.565322 4747 generic.go:334] "Generic (PLEG): container finished" podID="c59e50b1-bd44-4e42-bc34-5acefee2e0a0" containerID="2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693" exitCode=0 Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.565369 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" event={"ID":"c59e50b1-bd44-4e42-bc34-5acefee2e0a0","Type":"ContainerDied","Data":"2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693"} Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.565380 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.565396 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5465cd874c-kj49m" event={"ID":"c59e50b1-bd44-4e42-bc34-5acefee2e0a0","Type":"ContainerDied","Data":"a2bc21c0232ac4da4f92772263be6e1d5955da0473b1d11af5fbc588259df310"} Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.565417 4747 scope.go:117] "RemoveContainer" containerID="2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.586475 4747 scope.go:117] "RemoveContainer" containerID="2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693" Nov 28 13:38:02 crc kubenswrapper[4747]: E1128 13:38:02.587545 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693\": container with ID starting with 2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693 not found: ID does not exist" containerID="2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693" Nov 
28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.587611 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693"} err="failed to get container status \"2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693\": rpc error: code = NotFound desc = could not find container \"2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693\": container with ID starting with 2816837929f46dbb9ad4b1f6b02f9445e5f9f7efe5002d3a3e05683e6d512693 not found: ID does not exist" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.608881 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnh7p\" (UniqueName: \"kubernetes.io/projected/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-kube-api-access-jnh7p\") pod \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.608931 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-scripts\") pod \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.608961 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-config-data\") pod \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.608988 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-credential-keys\") pod \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " Nov 28 
13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.609121 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-fernet-keys\") pod \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\" (UID: \"c59e50b1-bd44-4e42-bc34-5acefee2e0a0\") " Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.613717 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-kube-api-access-jnh7p" (OuterVolumeSpecName: "kube-api-access-jnh7p") pod "c59e50b1-bd44-4e42-bc34-5acefee2e0a0" (UID: "c59e50b1-bd44-4e42-bc34-5acefee2e0a0"). InnerVolumeSpecName "kube-api-access-jnh7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.613780 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c59e50b1-bd44-4e42-bc34-5acefee2e0a0" (UID: "c59e50b1-bd44-4e42-bc34-5acefee2e0a0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.613809 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c59e50b1-bd44-4e42-bc34-5acefee2e0a0" (UID: "c59e50b1-bd44-4e42-bc34-5acefee2e0a0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.614846 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-scripts" (OuterVolumeSpecName: "scripts") pod "c59e50b1-bd44-4e42-bc34-5acefee2e0a0" (UID: "c59e50b1-bd44-4e42-bc34-5acefee2e0a0"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.638383 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-config-data" (OuterVolumeSpecName: "config-data") pod "c59e50b1-bd44-4e42-bc34-5acefee2e0a0" (UID: "c59e50b1-bd44-4e42-bc34-5acefee2e0a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.711774 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.711818 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnh7p\" (UniqueName: \"kubernetes.io/projected/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-kube-api-access-jnh7p\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.711834 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.711847 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.711861 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59e50b1-bd44-4e42-bc34-5acefee2e0a0-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.897563 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["keystone-kuttl-tests/keystone-5465cd874c-kj49m"] Nov 28 13:38:02 crc kubenswrapper[4747]: I1128 13:38:02.902422 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5465cd874c-kj49m"] Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.469776 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rjqxg"] Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.477919 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rjqxg"] Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.489344 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tgmws"] Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.496172 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tgmws"] Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.556140 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone4baa-account-delete-hskbt"] Nov 28 13:38:03 crc kubenswrapper[4747]: E1128 13:38:03.556499 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fd4e9d-888f-42d4-85ee-af5d42776858" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.556523 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fd4e9d-888f-42d4-85ee-af5d42776858" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: E1128 13:38:03.556545 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59e50b1-bd44-4e42-bc34-5acefee2e0a0" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.556553 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59e50b1-bd44-4e42-bc34-5acefee2e0a0" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: E1128 13:38:03.556573 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="54a9d1f5-ad51-40b3-9f18-bc662301a1a1" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.556582 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a9d1f5-ad51-40b3-9f18-bc662301a1a1" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.556730 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9fd4e9d-888f-42d4-85ee-af5d42776858" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.556754 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59e50b1-bd44-4e42-bc34-5acefee2e0a0" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.556765 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a9d1f5-ad51-40b3-9f18-bc662301a1a1" containerName="keystone-api" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.557321 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.571180 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone4baa-account-delete-hskbt"] Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.623746 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q27hj\" (UniqueName: \"kubernetes.io/projected/0ed200bd-1730-465d-ace4-62809c91b056-kube-api-access-q27hj\") pod \"keystone4baa-account-delete-hskbt\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.623825 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ed200bd-1730-465d-ace4-62809c91b056-operator-scripts\") pod 
\"keystone4baa-account-delete-hskbt\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.648119 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d85152-934b-47f1-9413-f0e589baac15" path="/var/lib/kubelet/pods/00d85152-934b-47f1-9413-f0e589baac15/volumes" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.648718 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a9d1f5-ad51-40b3-9f18-bc662301a1a1" path="/var/lib/kubelet/pods/54a9d1f5-ad51-40b3-9f18-bc662301a1a1/volumes" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.649246 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e65fc1-ca46-46b3-b83f-5e66982738bc" path="/var/lib/kubelet/pods/55e65fc1-ca46-46b3-b83f-5e66982738bc/volumes" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.649811 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c59e50b1-bd44-4e42-bc34-5acefee2e0a0" path="/var/lib/kubelet/pods/c59e50b1-bd44-4e42-bc34-5acefee2e0a0/volumes" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.725769 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ed200bd-1730-465d-ace4-62809c91b056-operator-scripts\") pod \"keystone4baa-account-delete-hskbt\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.725875 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q27hj\" (UniqueName: \"kubernetes.io/projected/0ed200bd-1730-465d-ace4-62809c91b056-kube-api-access-q27hj\") pod \"keystone4baa-account-delete-hskbt\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 
13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.726599 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ed200bd-1730-465d-ace4-62809c91b056-operator-scripts\") pod \"keystone4baa-account-delete-hskbt\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.740945 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q27hj\" (UniqueName: \"kubernetes.io/projected/0ed200bd-1730-465d-ace4-62809c91b056-kube-api-access-q27hj\") pod \"keystone4baa-account-delete-hskbt\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:03 crc kubenswrapper[4747]: I1128 13:38:03.874549 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:04 crc kubenswrapper[4747]: I1128 13:38:04.141920 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone4baa-account-delete-hskbt"] Nov 28 13:38:04 crc kubenswrapper[4747]: I1128 13:38:04.608574 4747 generic.go:334] "Generic (PLEG): container finished" podID="0ed200bd-1730-465d-ace4-62809c91b056" containerID="034f212cdf3ff034e2095e8361f3d47f3fab0be3d8a17ebf7ce37868d8a18e8f" exitCode=0 Nov 28 13:38:04 crc kubenswrapper[4747]: I1128 13:38:04.608726 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" event={"ID":"0ed200bd-1730-465d-ace4-62809c91b056","Type":"ContainerDied","Data":"034f212cdf3ff034e2095e8361f3d47f3fab0be3d8a17ebf7ce37868d8a18e8f"} Nov 28 13:38:04 crc kubenswrapper[4747]: I1128 13:38:04.609022 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" 
event={"ID":"0ed200bd-1730-465d-ace4-62809c91b056","Type":"ContainerStarted","Data":"186ba75c857be0cbb4d3a8f8f9357fa92b86029447511df6ad7bb109c3056809"} Nov 28 13:38:05 crc kubenswrapper[4747]: I1128 13:38:05.927608 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:05 crc kubenswrapper[4747]: I1128 13:38:05.961599 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ed200bd-1730-465d-ace4-62809c91b056-operator-scripts\") pod \"0ed200bd-1730-465d-ace4-62809c91b056\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " Nov 28 13:38:05 crc kubenswrapper[4747]: I1128 13:38:05.961740 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q27hj\" (UniqueName: \"kubernetes.io/projected/0ed200bd-1730-465d-ace4-62809c91b056-kube-api-access-q27hj\") pod \"0ed200bd-1730-465d-ace4-62809c91b056\" (UID: \"0ed200bd-1730-465d-ace4-62809c91b056\") " Nov 28 13:38:05 crc kubenswrapper[4747]: I1128 13:38:05.962611 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed200bd-1730-465d-ace4-62809c91b056-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ed200bd-1730-465d-ace4-62809c91b056" (UID: "0ed200bd-1730-465d-ace4-62809c91b056"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:38:05 crc kubenswrapper[4747]: I1128 13:38:05.972385 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed200bd-1730-465d-ace4-62809c91b056-kube-api-access-q27hj" (OuterVolumeSpecName: "kube-api-access-q27hj") pod "0ed200bd-1730-465d-ace4-62809c91b056" (UID: "0ed200bd-1730-465d-ace4-62809c91b056"). InnerVolumeSpecName "kube-api-access-q27hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:06 crc kubenswrapper[4747]: I1128 13:38:06.062666 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q27hj\" (UniqueName: \"kubernetes.io/projected/0ed200bd-1730-465d-ace4-62809c91b056-kube-api-access-q27hj\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:06 crc kubenswrapper[4747]: I1128 13:38:06.062693 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ed200bd-1730-465d-ace4-62809c91b056-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:06 crc kubenswrapper[4747]: I1128 13:38:06.634553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" event={"ID":"0ed200bd-1730-465d-ace4-62809c91b056","Type":"ContainerDied","Data":"186ba75c857be0cbb4d3a8f8f9357fa92b86029447511df6ad7bb109c3056809"} Nov 28 13:38:06 crc kubenswrapper[4747]: I1128 13:38:06.634782 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="186ba75c857be0cbb4d3a8f8f9357fa92b86029447511df6ad7bb109c3056809" Nov 28 13:38:06 crc kubenswrapper[4747]: I1128 13:38:06.634852 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone4baa-account-delete-hskbt" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.571070 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cj8c7"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.576401 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cj8c7"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.587044 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.594014 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone4baa-account-delete-hskbt"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.598521 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-4baa-account-create-update-g9rx9"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.602323 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone4baa-account-delete-hskbt"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.662421 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pgpzb"] Nov 28 13:38:08 crc kubenswrapper[4747]: E1128 13:38:08.662727 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed200bd-1730-465d-ace4-62809c91b056" containerName="mariadb-account-delete" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.662748 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed200bd-1730-465d-ace4-62809c91b056" containerName="mariadb-account-delete" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.662899 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed200bd-1730-465d-ace4-62809c91b056" containerName="mariadb-account-delete" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 
13:38:08.663432 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.675660 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pgpzb"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.701838 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvv6w\" (UniqueName: \"kubernetes.io/projected/ece27526-fcae-4ff8-842a-1bebdcb1aec2-kube-api-access-mvv6w\") pod \"keystone-db-create-pgpzb\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.701912 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece27526-fcae-4ff8-842a-1bebdcb1aec2-operator-scripts\") pod \"keystone-db-create-pgpzb\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.770715 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.771643 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.774620 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.777161 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj"] Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.803791 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c2a7cb-738b-4d5e-a504-31e78722389d-operator-scripts\") pod \"keystone-3f76-account-create-update-jzdrj\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.803863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvv6w\" (UniqueName: \"kubernetes.io/projected/ece27526-fcae-4ff8-842a-1bebdcb1aec2-kube-api-access-mvv6w\") pod \"keystone-db-create-pgpzb\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.803921 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece27526-fcae-4ff8-842a-1bebdcb1aec2-operator-scripts\") pod \"keystone-db-create-pgpzb\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.803975 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-222ld\" (UniqueName: \"kubernetes.io/projected/29c2a7cb-738b-4d5e-a504-31e78722389d-kube-api-access-222ld\") 
pod \"keystone-3f76-account-create-update-jzdrj\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.804950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece27526-fcae-4ff8-842a-1bebdcb1aec2-operator-scripts\") pod \"keystone-db-create-pgpzb\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.828107 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvv6w\" (UniqueName: \"kubernetes.io/projected/ece27526-fcae-4ff8-842a-1bebdcb1aec2-kube-api-access-mvv6w\") pod \"keystone-db-create-pgpzb\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.905735 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-222ld\" (UniqueName: \"kubernetes.io/projected/29c2a7cb-738b-4d5e-a504-31e78722389d-kube-api-access-222ld\") pod \"keystone-3f76-account-create-update-jzdrj\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.905811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c2a7cb-738b-4d5e-a504-31e78722389d-operator-scripts\") pod \"keystone-3f76-account-create-update-jzdrj\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.906527 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/29c2a7cb-738b-4d5e-a504-31e78722389d-operator-scripts\") pod \"keystone-3f76-account-create-update-jzdrj\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.922839 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-222ld\" (UniqueName: \"kubernetes.io/projected/29c2a7cb-738b-4d5e-a504-31e78722389d-kube-api-access-222ld\") pod \"keystone-3f76-account-create-update-jzdrj\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:08 crc kubenswrapper[4747]: I1128 13:38:08.984561 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.088763 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.289542 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj"] Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.440954 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pgpzb"] Nov 28 13:38:09 crc kubenswrapper[4747]: W1128 13:38:09.444750 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece27526_fcae_4ff8_842a_1bebdcb1aec2.slice/crio-e1c9636e33028a004becb367054924200599fc6c3ddd11a79f3b56e228fa072d WatchSource:0}: Error finding container e1c9636e33028a004becb367054924200599fc6c3ddd11a79f3b56e228fa072d: Status 404 returned error can't find the container with id e1c9636e33028a004becb367054924200599fc6c3ddd11a79f3b56e228fa072d Nov 28 13:38:09 
crc kubenswrapper[4747]: I1128 13:38:09.654268 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed200bd-1730-465d-ace4-62809c91b056" path="/var/lib/kubelet/pods/0ed200bd-1730-465d-ace4-62809c91b056/volumes" Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.654870 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51c44925-3a02-4b6e-a4c7-1c3911a05ed9" path="/var/lib/kubelet/pods/51c44925-3a02-4b6e-a4c7-1c3911a05ed9/volumes" Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.655339 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc24e552-b90d-48d9-847c-b448241ce8b1" path="/var/lib/kubelet/pods/bc24e552-b90d-48d9-847c-b448241ce8b1/volumes" Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.660362 4747 generic.go:334] "Generic (PLEG): container finished" podID="29c2a7cb-738b-4d5e-a504-31e78722389d" containerID="9fb0be8d8b4b630123ebf77bca36eaac92269b4d4dac5521ffbbbfb890abf4c4" exitCode=0 Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.660417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" event={"ID":"29c2a7cb-738b-4d5e-a504-31e78722389d","Type":"ContainerDied","Data":"9fb0be8d8b4b630123ebf77bca36eaac92269b4d4dac5521ffbbbfb890abf4c4"} Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.660441 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" event={"ID":"29c2a7cb-738b-4d5e-a504-31e78722389d","Type":"ContainerStarted","Data":"61008b340c53ce36b76967dae12fcd1ace6d45ff8c797e15a4411ed7317f2755"} Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.662169 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" event={"ID":"ece27526-fcae-4ff8-842a-1bebdcb1aec2","Type":"ContainerStarted","Data":"61d53147d8b414e017202486f565ba2add243fe3d10553a1a6f1ded91ec5dd3d"} Nov 28 
13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.662238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" event={"ID":"ece27526-fcae-4ff8-842a-1bebdcb1aec2","Type":"ContainerStarted","Data":"e1c9636e33028a004becb367054924200599fc6c3ddd11a79f3b56e228fa072d"} Nov 28 13:38:09 crc kubenswrapper[4747]: I1128 13:38:09.691419 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" podStartSLOduration=1.691398279 podStartE2EDuration="1.691398279s" podCreationTimestamp="2025-11-28 13:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:38:09.689984663 +0000 UTC m=+1142.352466393" watchObservedRunningTime="2025-11-28 13:38:09.691398279 +0000 UTC m=+1142.353880009" Nov 28 13:38:10 crc kubenswrapper[4747]: I1128 13:38:10.670434 4747 generic.go:334] "Generic (PLEG): container finished" podID="ece27526-fcae-4ff8-842a-1bebdcb1aec2" containerID="61d53147d8b414e017202486f565ba2add243fe3d10553a1a6f1ded91ec5dd3d" exitCode=0 Nov 28 13:38:10 crc kubenswrapper[4747]: I1128 13:38:10.670539 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" event={"ID":"ece27526-fcae-4ff8-842a-1bebdcb1aec2","Type":"ContainerDied","Data":"61d53147d8b414e017202486f565ba2add243fe3d10553a1a6f1ded91ec5dd3d"} Nov 28 13:38:10 crc kubenswrapper[4747]: I1128 13:38:10.976815 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.036735 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c2a7cb-738b-4d5e-a504-31e78722389d-operator-scripts\") pod \"29c2a7cb-738b-4d5e-a504-31e78722389d\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.036860 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-222ld\" (UniqueName: \"kubernetes.io/projected/29c2a7cb-738b-4d5e-a504-31e78722389d-kube-api-access-222ld\") pod \"29c2a7cb-738b-4d5e-a504-31e78722389d\" (UID: \"29c2a7cb-738b-4d5e-a504-31e78722389d\") " Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.037580 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c2a7cb-738b-4d5e-a504-31e78722389d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29c2a7cb-738b-4d5e-a504-31e78722389d" (UID: "29c2a7cb-738b-4d5e-a504-31e78722389d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.042870 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c2a7cb-738b-4d5e-a504-31e78722389d-kube-api-access-222ld" (OuterVolumeSpecName: "kube-api-access-222ld") pod "29c2a7cb-738b-4d5e-a504-31e78722389d" (UID: "29c2a7cb-738b-4d5e-a504-31e78722389d"). InnerVolumeSpecName "kube-api-access-222ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.138303 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-222ld\" (UniqueName: \"kubernetes.io/projected/29c2a7cb-738b-4d5e-a504-31e78722389d-kube-api-access-222ld\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.138353 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c2a7cb-738b-4d5e-a504-31e78722389d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.679615 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" event={"ID":"29c2a7cb-738b-4d5e-a504-31e78722389d","Type":"ContainerDied","Data":"61008b340c53ce36b76967dae12fcd1ace6d45ff8c797e15a4411ed7317f2755"} Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.679977 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61008b340c53ce36b76967dae12fcd1ace6d45ff8c797e15a4411ed7317f2755" Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.679659 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj" Nov 28 13:38:11 crc kubenswrapper[4747]: I1128 13:38:11.973540 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.051503 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvv6w\" (UniqueName: \"kubernetes.io/projected/ece27526-fcae-4ff8-842a-1bebdcb1aec2-kube-api-access-mvv6w\") pod \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.051634 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece27526-fcae-4ff8-842a-1bebdcb1aec2-operator-scripts\") pod \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\" (UID: \"ece27526-fcae-4ff8-842a-1bebdcb1aec2\") " Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.052416 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece27526-fcae-4ff8-842a-1bebdcb1aec2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ece27526-fcae-4ff8-842a-1bebdcb1aec2" (UID: "ece27526-fcae-4ff8-842a-1bebdcb1aec2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.056268 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece27526-fcae-4ff8-842a-1bebdcb1aec2-kube-api-access-mvv6w" (OuterVolumeSpecName: "kube-api-access-mvv6w") pod "ece27526-fcae-4ff8-842a-1bebdcb1aec2" (UID: "ece27526-fcae-4ff8-842a-1bebdcb1aec2"). InnerVolumeSpecName "kube-api-access-mvv6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.153932 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvv6w\" (UniqueName: \"kubernetes.io/projected/ece27526-fcae-4ff8-842a-1bebdcb1aec2-kube-api-access-mvv6w\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.153983 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece27526-fcae-4ff8-842a-1bebdcb1aec2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.689843 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" event={"ID":"ece27526-fcae-4ff8-842a-1bebdcb1aec2","Type":"ContainerDied","Data":"e1c9636e33028a004becb367054924200599fc6c3ddd11a79f3b56e228fa072d"} Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.689889 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1c9636e33028a004becb367054924200599fc6c3ddd11a79f3b56e228fa072d" Nov 28 13:38:12 crc kubenswrapper[4747]: I1128 13:38:12.689911 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-pgpzb" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.324529 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d5hw6"] Nov 28 13:38:14 crc kubenswrapper[4747]: E1128 13:38:14.325069 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c2a7cb-738b-4d5e-a504-31e78722389d" containerName="mariadb-account-create-update" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.325081 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c2a7cb-738b-4d5e-a504-31e78722389d" containerName="mariadb-account-create-update" Nov 28 13:38:14 crc kubenswrapper[4747]: E1128 13:38:14.325108 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece27526-fcae-4ff8-842a-1bebdcb1aec2" containerName="mariadb-database-create" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.325115 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece27526-fcae-4ff8-842a-1bebdcb1aec2" containerName="mariadb-database-create" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.325232 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece27526-fcae-4ff8-842a-1bebdcb1aec2" containerName="mariadb-database-create" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.325244 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c2a7cb-738b-4d5e-a504-31e78722389d" containerName="mariadb-account-create-update" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.325700 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.328173 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-kstp8" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.330185 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.331526 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.331564 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.331657 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.338732 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d5hw6"] Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.386927 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-combined-ca-bundle\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.387156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkx8\" (UniqueName: \"kubernetes.io/projected/db392186-f74e-498f-ab8c-b92e8695028a-kube-api-access-7tkx8\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 
13:38:14.387386 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-config-data\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.488998 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkx8\" (UniqueName: \"kubernetes.io/projected/db392186-f74e-498f-ab8c-b92e8695028a-kube-api-access-7tkx8\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.489081 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-config-data\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.489117 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-combined-ca-bundle\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.495092 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-combined-ca-bundle\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.495507 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-config-data\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.516775 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkx8\" (UniqueName: \"kubernetes.io/projected/db392186-f74e-498f-ab8c-b92e8695028a-kube-api-access-7tkx8\") pod \"keystone-db-sync-d5hw6\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:14 crc kubenswrapper[4747]: I1128 13:38:14.643838 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:15 crc kubenswrapper[4747]: I1128 13:38:15.113939 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d5hw6"] Nov 28 13:38:15 crc kubenswrapper[4747]: I1128 13:38:15.720771 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" event={"ID":"db392186-f74e-498f-ab8c-b92e8695028a","Type":"ContainerStarted","Data":"2300a80e951736961c63a743af1abc6816ae8363a35c2b03e48bdf9bfb8b178a"} Nov 28 13:38:15 crc kubenswrapper[4747]: I1128 13:38:15.721018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" event={"ID":"db392186-f74e-498f-ab8c-b92e8695028a","Type":"ContainerStarted","Data":"bcc7e5dd623d659ca59e3e97f0982a0f9531bce5ccb4ae2258557c25f821cf1d"} Nov 28 13:38:17 crc kubenswrapper[4747]: I1128 13:38:17.736482 4747 generic.go:334] "Generic (PLEG): container finished" podID="db392186-f74e-498f-ab8c-b92e8695028a" containerID="2300a80e951736961c63a743af1abc6816ae8363a35c2b03e48bdf9bfb8b178a" exitCode=0 Nov 28 13:38:17 crc kubenswrapper[4747]: I1128 
13:38:17.736575 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" event={"ID":"db392186-f74e-498f-ab8c-b92e8695028a","Type":"ContainerDied","Data":"2300a80e951736961c63a743af1abc6816ae8363a35c2b03e48bdf9bfb8b178a"} Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.036890 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.156707 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-config-data\") pod \"db392186-f74e-498f-ab8c-b92e8695028a\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.156775 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-combined-ca-bundle\") pod \"db392186-f74e-498f-ab8c-b92e8695028a\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.156861 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkx8\" (UniqueName: \"kubernetes.io/projected/db392186-f74e-498f-ab8c-b92e8695028a-kube-api-access-7tkx8\") pod \"db392186-f74e-498f-ab8c-b92e8695028a\" (UID: \"db392186-f74e-498f-ab8c-b92e8695028a\") " Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.163172 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db392186-f74e-498f-ab8c-b92e8695028a-kube-api-access-7tkx8" (OuterVolumeSpecName: "kube-api-access-7tkx8") pod "db392186-f74e-498f-ab8c-b92e8695028a" (UID: "db392186-f74e-498f-ab8c-b92e8695028a"). InnerVolumeSpecName "kube-api-access-7tkx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.178922 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db392186-f74e-498f-ab8c-b92e8695028a" (UID: "db392186-f74e-498f-ab8c-b92e8695028a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.199249 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-config-data" (OuterVolumeSpecName: "config-data") pod "db392186-f74e-498f-ab8c-b92e8695028a" (UID: "db392186-f74e-498f-ab8c-b92e8695028a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.258394 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkx8\" (UniqueName: \"kubernetes.io/projected/db392186-f74e-498f-ab8c-b92e8695028a-kube-api-access-7tkx8\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.258451 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.258461 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db392186-f74e-498f-ab8c-b92e8695028a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.755988 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.755922 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d5hw6" event={"ID":"db392186-f74e-498f-ab8c-b92e8695028a","Type":"ContainerDied","Data":"bcc7e5dd623d659ca59e3e97f0982a0f9531bce5ccb4ae2258557c25f821cf1d"} Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.756112 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcc7e5dd623d659ca59e3e97f0982a0f9531bce5ccb4ae2258557c25f821cf1d" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.933664 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-xdgx9"] Nov 28 13:38:19 crc kubenswrapper[4747]: E1128 13:38:19.934241 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db392186-f74e-498f-ab8c-b92e8695028a" containerName="keystone-db-sync" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.934259 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="db392186-f74e-498f-ab8c-b92e8695028a" containerName="keystone-db-sync" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.934412 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="db392186-f74e-498f-ab8c-b92e8695028a" containerName="keystone-db-sync" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.934905 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.939123 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.939165 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.939289 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.939470 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-kstp8" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.939700 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.940008 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.945936 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-xdgx9"] Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.965515 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-credential-keys\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.965621 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-fernet-keys\") pod \"keystone-bootstrap-xdgx9\" 
(UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.965684 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-config-data\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.965748 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-scripts\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.965806 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-combined-ca-bundle\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:19 crc kubenswrapper[4747]: I1128 13:38:19.965861 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dt6l\" (UniqueName: \"kubernetes.io/projected/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-kube-api-access-7dt6l\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.066767 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-scripts\") pod 
\"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.066815 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-combined-ca-bundle\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.066848 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dt6l\" (UniqueName: \"kubernetes.io/projected/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-kube-api-access-7dt6l\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.066910 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-credential-keys\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.066961 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-fernet-keys\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.067007 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-config-data\") pod \"keystone-bootstrap-xdgx9\" (UID: 
\"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.071032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-scripts\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.071224 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-combined-ca-bundle\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.071335 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-fernet-keys\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.072015 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-config-data\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.072427 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-credential-keys\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 
13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.100072 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dt6l\" (UniqueName: \"kubernetes.io/projected/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-kube-api-access-7dt6l\") pod \"keystone-bootstrap-xdgx9\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.258395 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.687019 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-xdgx9"] Nov 28 13:38:20 crc kubenswrapper[4747]: I1128 13:38:20.766999 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" event={"ID":"42e62d70-9120-4cb6-99e6-abb3bb9bbe50","Type":"ContainerStarted","Data":"4f3254a53908efc2954cce79a3ab1c4474e151ecd7b9e433ba4d6834ec149711"} Nov 28 13:38:21 crc kubenswrapper[4747]: I1128 13:38:21.776407 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" event={"ID":"42e62d70-9120-4cb6-99e6-abb3bb9bbe50","Type":"ContainerStarted","Data":"1a673ccc29f883f6368ca314fb0b5cca2020e9179b2ee355fbf084b46f139d49"} Nov 28 13:38:23 crc kubenswrapper[4747]: I1128 13:38:23.793530 4747 generic.go:334] "Generic (PLEG): container finished" podID="42e62d70-9120-4cb6-99e6-abb3bb9bbe50" containerID="1a673ccc29f883f6368ca314fb0b5cca2020e9179b2ee355fbf084b46f139d49" exitCode=0 Nov 28 13:38:23 crc kubenswrapper[4747]: I1128 13:38:23.793636 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" event={"ID":"42e62d70-9120-4cb6-99e6-abb3bb9bbe50","Type":"ContainerDied","Data":"1a673ccc29f883f6368ca314fb0b5cca2020e9179b2ee355fbf084b46f139d49"} Nov 28 13:38:25 crc 
kubenswrapper[4747]: I1128 13:38:25.064757 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.144872 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-fernet-keys\") pod \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.145124 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-scripts\") pod \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.145193 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-credential-keys\") pod \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.145247 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-config-data\") pod \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.145279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dt6l\" (UniqueName: \"kubernetes.io/projected/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-kube-api-access-7dt6l\") pod \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.145330 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-combined-ca-bundle\") pod \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\" (UID: \"42e62d70-9120-4cb6-99e6-abb3bb9bbe50\") " Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.158373 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "42e62d70-9120-4cb6-99e6-abb3bb9bbe50" (UID: "42e62d70-9120-4cb6-99e6-abb3bb9bbe50"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.158647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-kube-api-access-7dt6l" (OuterVolumeSpecName: "kube-api-access-7dt6l") pod "42e62d70-9120-4cb6-99e6-abb3bb9bbe50" (UID: "42e62d70-9120-4cb6-99e6-abb3bb9bbe50"). InnerVolumeSpecName "kube-api-access-7dt6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.158454 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-scripts" (OuterVolumeSpecName: "scripts") pod "42e62d70-9120-4cb6-99e6-abb3bb9bbe50" (UID: "42e62d70-9120-4cb6-99e6-abb3bb9bbe50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.158508 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "42e62d70-9120-4cb6-99e6-abb3bb9bbe50" (UID: "42e62d70-9120-4cb6-99e6-abb3bb9bbe50"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.165506 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-config-data" (OuterVolumeSpecName: "config-data") pod "42e62d70-9120-4cb6-99e6-abb3bb9bbe50" (UID: "42e62d70-9120-4cb6-99e6-abb3bb9bbe50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.171466 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42e62d70-9120-4cb6-99e6-abb3bb9bbe50" (UID: "42e62d70-9120-4cb6-99e6-abb3bb9bbe50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.248556 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.248603 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.248626 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.248643 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:25 crc 
kubenswrapper[4747]: I1128 13:38:25.248656 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dt6l\" (UniqueName: \"kubernetes.io/projected/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-kube-api-access-7dt6l\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.248671 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e62d70-9120-4cb6-99e6-abb3bb9bbe50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.819362 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" event={"ID":"42e62d70-9120-4cb6-99e6-abb3bb9bbe50","Type":"ContainerDied","Data":"4f3254a53908efc2954cce79a3ab1c4474e151ecd7b9e433ba4d6834ec149711"} Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.819427 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f3254a53908efc2954cce79a3ab1c4474e151ecd7b9e433ba4d6834ec149711" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.819534 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-xdgx9" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.913907 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-86ccdbd99c-68wzz"] Nov 28 13:38:25 crc kubenswrapper[4747]: E1128 13:38:25.914233 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e62d70-9120-4cb6-99e6-abb3bb9bbe50" containerName="keystone-bootstrap" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.914250 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e62d70-9120-4cb6-99e6-abb3bb9bbe50" containerName="keystone-bootstrap" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.914398 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e62d70-9120-4cb6-99e6-abb3bb9bbe50" containerName="keystone-bootstrap" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.914900 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.920846 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-internal-svc" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.920971 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-public-svc" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.921468 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.921512 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.921641 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.921820 4747 
reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.921964 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-kstp8" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.927879 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-86ccdbd99c-68wzz"] Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956378 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-credential-keys\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956432 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdnl4\" (UniqueName: \"kubernetes.io/projected/265f269e-8ab6-4a3a-ad45-ec361868d039-kube-api-access-tdnl4\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956468 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-internal-tls-certs\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956609 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-fernet-keys\") pod 
\"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956675 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-scripts\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956735 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-public-tls-certs\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-combined-ca-bundle\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:25 crc kubenswrapper[4747]: I1128 13:38:25.956805 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-config-data\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057638 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-fernet-keys\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057697 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-scripts\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-public-tls-certs\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057766 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-combined-ca-bundle\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057794 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-config-data\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057818 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-credential-keys\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057854 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdnl4\" (UniqueName: \"kubernetes.io/projected/265f269e-8ab6-4a3a-ad45-ec361868d039-kube-api-access-tdnl4\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.057901 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-internal-tls-certs\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.061633 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-scripts\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.061754 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-public-tls-certs\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.062447 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-credential-keys\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.062772 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-config-data\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.062773 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-combined-ca-bundle\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.063656 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-internal-tls-certs\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.065034 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-fernet-keys\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.087400 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdnl4\" (UniqueName: 
\"kubernetes.io/projected/265f269e-8ab6-4a3a-ad45-ec361868d039-kube-api-access-tdnl4\") pod \"keystone-86ccdbd99c-68wzz\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.245408 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.524195 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-86ccdbd99c-68wzz"] Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.842114 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" event={"ID":"265f269e-8ab6-4a3a-ad45-ec361868d039","Type":"ContainerStarted","Data":"309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0"} Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.842657 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.842690 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" event={"ID":"265f269e-8ab6-4a3a-ad45-ec361868d039","Type":"ContainerStarted","Data":"b036a585fb25ff2883f61d367ab24ea23e3b3d8a193f72ff8ad9ae884656dcc4"} Nov 28 13:38:26 crc kubenswrapper[4747]: I1128 13:38:26.889898 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" podStartSLOduration=1.889858588 podStartE2EDuration="1.889858588s" podCreationTimestamp="2025-11-28 13:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:38:26.878143494 +0000 UTC m=+1159.540625254" watchObservedRunningTime="2025-11-28 13:38:26.889858588 +0000 UTC m=+1159.552340358" 
Nov 28 13:38:57 crc kubenswrapper[4747]: I1128 13:38:57.654300 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.134102 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-xdgx9"] Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.145151 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d5hw6"] Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.152218 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-xdgx9"] Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.158661 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d5hw6"] Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.164668 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-86ccdbd99c-68wzz"] Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.164913 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" podUID="265f269e-8ab6-4a3a-ad45-ec361868d039" containerName="keystone-api" containerID="cri-o://309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0" gracePeriod=30 Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.195569 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone3f76-account-delete-spgzt"] Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.196519 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.206102 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone3f76-account-delete-spgzt"] Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.246334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-operator-scripts\") pod \"keystone3f76-account-delete-spgzt\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.246376 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4k5\" (UniqueName: \"kubernetes.io/projected/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-kube-api-access-rr4k5\") pod \"keystone3f76-account-delete-spgzt\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.347194 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-operator-scripts\") pod \"keystone3f76-account-delete-spgzt\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.347261 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4k5\" (UniqueName: \"kubernetes.io/projected/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-kube-api-access-rr4k5\") pod \"keystone3f76-account-delete-spgzt\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc 
kubenswrapper[4747]: I1128 13:38:58.347955 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-operator-scripts\") pod \"keystone3f76-account-delete-spgzt\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.368850 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4k5\" (UniqueName: \"kubernetes.io/projected/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-kube-api-access-rr4k5\") pod \"keystone3f76-account-delete-spgzt\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.525045 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:38:58 crc kubenswrapper[4747]: I1128 13:38:58.766792 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone3f76-account-delete-spgzt"] Nov 28 13:38:59 crc kubenswrapper[4747]: I1128 13:38:59.107181 4747 generic.go:334] "Generic (PLEG): container finished" podID="e05a94ce-4fb8-47ad-9f8e-67b721c01a3d" containerID="268af55e61f04f76dfda085da104d3a3c5df8430103097b31d4c3d813a1d768a" exitCode=0 Nov 28 13:38:59 crc kubenswrapper[4747]: I1128 13:38:59.107271 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" event={"ID":"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d","Type":"ContainerDied","Data":"268af55e61f04f76dfda085da104d3a3c5df8430103097b31d4c3d813a1d768a"} Nov 28 13:38:59 crc kubenswrapper[4747]: I1128 13:38:59.107825 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" 
event={"ID":"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d","Type":"ContainerStarted","Data":"ee6a6be80c2d050f147ea81f6ef53f4f129a0b2e5c9f6fb0f1e8299e67a2e52a"} Nov 28 13:38:59 crc kubenswrapper[4747]: I1128 13:38:59.656634 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e62d70-9120-4cb6-99e6-abb3bb9bbe50" path="/var/lib/kubelet/pods/42e62d70-9120-4cb6-99e6-abb3bb9bbe50/volumes" Nov 28 13:38:59 crc kubenswrapper[4747]: I1128 13:38:59.657704 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db392186-f74e-498f-ab8c-b92e8695028a" path="/var/lib/kubelet/pods/db392186-f74e-498f-ab8c-b92e8695028a/volumes" Nov 28 13:39:00 crc kubenswrapper[4747]: I1128 13:39:00.419751 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:39:00 crc kubenswrapper[4747]: I1128 13:39:00.585496 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr4k5\" (UniqueName: \"kubernetes.io/projected/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-kube-api-access-rr4k5\") pod \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " Nov 28 13:39:00 crc kubenswrapper[4747]: I1128 13:39:00.585602 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-operator-scripts\") pod \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\" (UID: \"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d\") " Nov 28 13:39:00 crc kubenswrapper[4747]: I1128 13:39:00.586673 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e05a94ce-4fb8-47ad-9f8e-67b721c01a3d" (UID: "e05a94ce-4fb8-47ad-9f8e-67b721c01a3d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:39:00 crc kubenswrapper[4747]: I1128 13:39:00.593777 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-kube-api-access-rr4k5" (OuterVolumeSpecName: "kube-api-access-rr4k5") pod "e05a94ce-4fb8-47ad-9f8e-67b721c01a3d" (UID: "e05a94ce-4fb8-47ad-9f8e-67b721c01a3d"). InnerVolumeSpecName "kube-api-access-rr4k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:00 crc kubenswrapper[4747]: I1128 13:39:00.687777 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr4k5\" (UniqueName: \"kubernetes.io/projected/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-kube-api-access-rr4k5\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:00 crc kubenswrapper[4747]: I1128 13:39:00.687839 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.126549 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" event={"ID":"e05a94ce-4fb8-47ad-9f8e-67b721c01a3d","Type":"ContainerDied","Data":"ee6a6be80c2d050f147ea81f6ef53f4f129a0b2e5c9f6fb0f1e8299e67a2e52a"} Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.126609 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee6a6be80c2d050f147ea81f6ef53f4f129a0b2e5c9f6fb0f1e8299e67a2e52a" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.126665 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone3f76-account-delete-spgzt" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.762466 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.913755 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-internal-tls-certs\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.914263 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-scripts\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.914305 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-combined-ca-bundle\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.914375 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-credential-keys\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.914430 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-public-tls-certs\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.914476 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-config-data\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.914536 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdnl4\" (UniqueName: \"kubernetes.io/projected/265f269e-8ab6-4a3a-ad45-ec361868d039-kube-api-access-tdnl4\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.914627 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-fernet-keys\") pod \"265f269e-8ab6-4a3a-ad45-ec361868d039\" (UID: \"265f269e-8ab6-4a3a-ad45-ec361868d039\") " Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.919657 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.923261 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265f269e-8ab6-4a3a-ad45-ec361868d039-kube-api-access-tdnl4" (OuterVolumeSpecName: "kube-api-access-tdnl4") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "kube-api-access-tdnl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.930509 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.937427 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-scripts" (OuterVolumeSpecName: "scripts") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.954243 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-config-data" (OuterVolumeSpecName: "config-data") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.954926 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.959080 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:01 crc kubenswrapper[4747]: I1128 13:39:01.971780 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "265f269e-8ab6-4a3a-ad45-ec361868d039" (UID: "265f269e-8ab6-4a3a-ad45-ec361868d039"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016816 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016866 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016884 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdnl4\" (UniqueName: \"kubernetes.io/projected/265f269e-8ab6-4a3a-ad45-ec361868d039-kube-api-access-tdnl4\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016904 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-fernet-keys\") on node \"crc\" 
DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016919 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016935 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016955 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.016972 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/265f269e-8ab6-4a3a-ad45-ec361868d039-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.134812 4747 generic.go:334] "Generic (PLEG): container finished" podID="265f269e-8ab6-4a3a-ad45-ec361868d039" containerID="309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0" exitCode=0 Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.134858 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" event={"ID":"265f269e-8ab6-4a3a-ad45-ec361868d039","Type":"ContainerDied","Data":"309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0"} Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.134893 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.134914 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-86ccdbd99c-68wzz" event={"ID":"265f269e-8ab6-4a3a-ad45-ec361868d039","Type":"ContainerDied","Data":"b036a585fb25ff2883f61d367ab24ea23e3b3d8a193f72ff8ad9ae884656dcc4"} Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.134937 4747 scope.go:117] "RemoveContainer" containerID="309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.177881 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-86ccdbd99c-68wzz"] Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.183300 4747 scope.go:117] "RemoveContainer" containerID="309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0" Nov 28 13:39:02 crc kubenswrapper[4747]: E1128 13:39:02.183967 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0\": container with ID starting with 309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0 not found: ID does not exist" containerID="309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0" Nov 28 13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.184032 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0"} err="failed to get container status \"309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0\": rpc error: code = NotFound desc = could not find container \"309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0\": container with ID starting with 309190e38e403fa3204e62c261b5faaec311cb11245427b10154276946cefdd0 not found: ID does not exist" Nov 28 
13:39:02 crc kubenswrapper[4747]: I1128 13:39:02.190575 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-86ccdbd99c-68wzz"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.238627 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pgpzb"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.247312 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-pgpzb"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.265812 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone3f76-account-delete-spgzt"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.270343 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.285074 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-3f76-account-create-update-jzdrj"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.289854 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone3f76-account-delete-spgzt"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.422483 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qdv5f"] Nov 28 13:39:03 crc kubenswrapper[4747]: E1128 13:39:03.422779 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265f269e-8ab6-4a3a-ad45-ec361868d039" containerName="keystone-api" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.422798 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="265f269e-8ab6-4a3a-ad45-ec361868d039" containerName="keystone-api" Nov 28 13:39:03 crc kubenswrapper[4747]: E1128 13:39:03.422813 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05a94ce-4fb8-47ad-9f8e-67b721c01a3d" 
containerName="mariadb-account-delete" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.422822 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05a94ce-4fb8-47ad-9f8e-67b721c01a3d" containerName="mariadb-account-delete" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.422965 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="265f269e-8ab6-4a3a-ad45-ec361868d039" containerName="keystone-api" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.422995 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05a94ce-4fb8-47ad-9f8e-67b721c01a3d" containerName="mariadb-account-delete" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.423518 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.431176 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.432007 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.434391 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.438862 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qdv5f"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.446691 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm"] Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.541627 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkpbn\" (UniqueName: \"kubernetes.io/projected/89d462d4-339b-4871-aefe-8016e6c653f1-kube-api-access-pkpbn\") pod \"keystone-055a-account-create-update-l7bfm\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.542290 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89sx8\" (UniqueName: \"kubernetes.io/projected/1f1a8ba0-1b11-451f-a736-3258d120b849-kube-api-access-89sx8\") pod \"keystone-db-create-qdv5f\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.542433 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f1a8ba0-1b11-451f-a736-3258d120b849-operator-scripts\") pod \"keystone-db-create-qdv5f\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.542578 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d462d4-339b-4871-aefe-8016e6c653f1-operator-scripts\") pod \"keystone-055a-account-create-update-l7bfm\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.643424 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89sx8\" (UniqueName: \"kubernetes.io/projected/1f1a8ba0-1b11-451f-a736-3258d120b849-kube-api-access-89sx8\") pod \"keystone-db-create-qdv5f\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.643493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f1a8ba0-1b11-451f-a736-3258d120b849-operator-scripts\") pod \"keystone-db-create-qdv5f\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.643517 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d462d4-339b-4871-aefe-8016e6c653f1-operator-scripts\") pod \"keystone-055a-account-create-update-l7bfm\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.643628 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkpbn\" (UniqueName: \"kubernetes.io/projected/89d462d4-339b-4871-aefe-8016e6c653f1-kube-api-access-pkpbn\") pod \"keystone-055a-account-create-update-l7bfm\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " 
pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.645023 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f1a8ba0-1b11-451f-a736-3258d120b849-operator-scripts\") pod \"keystone-db-create-qdv5f\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.645037 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d462d4-339b-4871-aefe-8016e6c653f1-operator-scripts\") pod \"keystone-055a-account-create-update-l7bfm\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.652677 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265f269e-8ab6-4a3a-ad45-ec361868d039" path="/var/lib/kubelet/pods/265f269e-8ab6-4a3a-ad45-ec361868d039/volumes" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.653505 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c2a7cb-738b-4d5e-a504-31e78722389d" path="/var/lib/kubelet/pods/29c2a7cb-738b-4d5e-a504-31e78722389d/volumes" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.654273 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05a94ce-4fb8-47ad-9f8e-67b721c01a3d" path="/var/lib/kubelet/pods/e05a94ce-4fb8-47ad-9f8e-67b721c01a3d/volumes" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.654831 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece27526-fcae-4ff8-842a-1bebdcb1aec2" path="/var/lib/kubelet/pods/ece27526-fcae-4ff8-842a-1bebdcb1aec2/volumes" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.667260 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-89sx8\" (UniqueName: \"kubernetes.io/projected/1f1a8ba0-1b11-451f-a736-3258d120b849-kube-api-access-89sx8\") pod \"keystone-db-create-qdv5f\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.668279 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkpbn\" (UniqueName: \"kubernetes.io/projected/89d462d4-339b-4871-aefe-8016e6c653f1-kube-api-access-pkpbn\") pod \"keystone-055a-account-create-update-l7bfm\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.744814 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:03 crc kubenswrapper[4747]: I1128 13:39:03.755596 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:04 crc kubenswrapper[4747]: I1128 13:39:04.270005 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qdv5f"] Nov 28 13:39:04 crc kubenswrapper[4747]: W1128 13:39:04.316138 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89d462d4_339b_4871_aefe_8016e6c653f1.slice/crio-a09762c5531033366bf8c51e555d79bbd67943d9c2dc8fb244af16da0ada26f8 WatchSource:0}: Error finding container a09762c5531033366bf8c51e555d79bbd67943d9c2dc8fb244af16da0ada26f8: Status 404 returned error can't find the container with id a09762c5531033366bf8c51e555d79bbd67943d9c2dc8fb244af16da0ada26f8 Nov 28 13:39:04 crc kubenswrapper[4747]: I1128 13:39:04.316659 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm"] Nov 28 13:39:05 crc kubenswrapper[4747]: I1128 13:39:05.179797 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f1a8ba0-1b11-451f-a736-3258d120b849" containerID="8b0f29fb0424b9090d05cc0fa3a54b5b8919472ba6d9b9801365fc937641f5e9" exitCode=0 Nov 28 13:39:05 crc kubenswrapper[4747]: I1128 13:39:05.180139 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qdv5f" event={"ID":"1f1a8ba0-1b11-451f-a736-3258d120b849","Type":"ContainerDied","Data":"8b0f29fb0424b9090d05cc0fa3a54b5b8919472ba6d9b9801365fc937641f5e9"} Nov 28 13:39:05 crc kubenswrapper[4747]: I1128 13:39:05.180171 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qdv5f" event={"ID":"1f1a8ba0-1b11-451f-a736-3258d120b849","Type":"ContainerStarted","Data":"572e5a4cfecab40792f90600de6abb86aebd379af4b171aa1fc97cacd133b20a"} Nov 28 13:39:05 crc kubenswrapper[4747]: I1128 13:39:05.182439 4747 generic.go:334] "Generic (PLEG): 
container finished" podID="89d462d4-339b-4871-aefe-8016e6c653f1" containerID="5a17dfb9a1b0d7553ed30ffdf577d35d1d80dc591537255c25c3f7eb4a82e5cb" exitCode=0 Nov 28 13:39:05 crc kubenswrapper[4747]: I1128 13:39:05.182470 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" event={"ID":"89d462d4-339b-4871-aefe-8016e6c653f1","Type":"ContainerDied","Data":"5a17dfb9a1b0d7553ed30ffdf577d35d1d80dc591537255c25c3f7eb4a82e5cb"} Nov 28 13:39:05 crc kubenswrapper[4747]: I1128 13:39:05.182487 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" event={"ID":"89d462d4-339b-4871-aefe-8016e6c653f1","Type":"ContainerStarted","Data":"a09762c5531033366bf8c51e555d79bbd67943d9c2dc8fb244af16da0ada26f8"} Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.561050 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.566660 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.590154 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f1a8ba0-1b11-451f-a736-3258d120b849-operator-scripts\") pod \"1f1a8ba0-1b11-451f-a736-3258d120b849\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.590270 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89sx8\" (UniqueName: \"kubernetes.io/projected/1f1a8ba0-1b11-451f-a736-3258d120b849-kube-api-access-89sx8\") pod \"1f1a8ba0-1b11-451f-a736-3258d120b849\" (UID: \"1f1a8ba0-1b11-451f-a736-3258d120b849\") " Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.590305 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d462d4-339b-4871-aefe-8016e6c653f1-operator-scripts\") pod \"89d462d4-339b-4871-aefe-8016e6c653f1\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.590366 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkpbn\" (UniqueName: \"kubernetes.io/projected/89d462d4-339b-4871-aefe-8016e6c653f1-kube-api-access-pkpbn\") pod \"89d462d4-339b-4871-aefe-8016e6c653f1\" (UID: \"89d462d4-339b-4871-aefe-8016e6c653f1\") " Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.591032 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f1a8ba0-1b11-451f-a736-3258d120b849-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f1a8ba0-1b11-451f-a736-3258d120b849" (UID: "1f1a8ba0-1b11-451f-a736-3258d120b849"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.591274 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89d462d4-339b-4871-aefe-8016e6c653f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89d462d4-339b-4871-aefe-8016e6c653f1" (UID: "89d462d4-339b-4871-aefe-8016e6c653f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.598639 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1a8ba0-1b11-451f-a736-3258d120b849-kube-api-access-89sx8" (OuterVolumeSpecName: "kube-api-access-89sx8") pod "1f1a8ba0-1b11-451f-a736-3258d120b849" (UID: "1f1a8ba0-1b11-451f-a736-3258d120b849"). InnerVolumeSpecName "kube-api-access-89sx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.603374 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d462d4-339b-4871-aefe-8016e6c653f1-kube-api-access-pkpbn" (OuterVolumeSpecName: "kube-api-access-pkpbn") pod "89d462d4-339b-4871-aefe-8016e6c653f1" (UID: "89d462d4-339b-4871-aefe-8016e6c653f1"). InnerVolumeSpecName "kube-api-access-pkpbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.691776 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f1a8ba0-1b11-451f-a736-3258d120b849-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.692038 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89sx8\" (UniqueName: \"kubernetes.io/projected/1f1a8ba0-1b11-451f-a736-3258d120b849-kube-api-access-89sx8\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.692124 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d462d4-339b-4871-aefe-8016e6c653f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:06 crc kubenswrapper[4747]: I1128 13:39:06.692268 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkpbn\" (UniqueName: \"kubernetes.io/projected/89d462d4-339b-4871-aefe-8016e6c653f1-kube-api-access-pkpbn\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:07 crc kubenswrapper[4747]: I1128 13:39:07.204463 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qdv5f" event={"ID":"1f1a8ba0-1b11-451f-a736-3258d120b849","Type":"ContainerDied","Data":"572e5a4cfecab40792f90600de6abb86aebd379af4b171aa1fc97cacd133b20a"} Nov 28 13:39:07 crc kubenswrapper[4747]: I1128 13:39:07.204518 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="572e5a4cfecab40792f90600de6abb86aebd379af4b171aa1fc97cacd133b20a" Nov 28 13:39:07 crc kubenswrapper[4747]: I1128 13:39:07.204519 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qdv5f" Nov 28 13:39:07 crc kubenswrapper[4747]: I1128 13:39:07.206744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" event={"ID":"89d462d4-339b-4871-aefe-8016e6c653f1","Type":"ContainerDied","Data":"a09762c5531033366bf8c51e555d79bbd67943d9c2dc8fb244af16da0ada26f8"} Nov 28 13:39:07 crc kubenswrapper[4747]: I1128 13:39:07.206793 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09762c5531033366bf8c51e555d79bbd67943d9c2dc8fb244af16da0ada26f8" Nov 28 13:39:07 crc kubenswrapper[4747]: I1128 13:39:07.206872 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.025411 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-xq82t"] Nov 28 13:39:09 crc kubenswrapper[4747]: E1128 13:39:09.026094 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1a8ba0-1b11-451f-a736-3258d120b849" containerName="mariadb-database-create" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.026115 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1a8ba0-1b11-451f-a736-3258d120b849" containerName="mariadb-database-create" Nov 28 13:39:09 crc kubenswrapper[4747]: E1128 13:39:09.026150 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d462d4-339b-4871-aefe-8016e6c653f1" containerName="mariadb-account-create-update" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.026163 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d462d4-339b-4871-aefe-8016e6c653f1" containerName="mariadb-account-create-update" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.026392 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89d462d4-339b-4871-aefe-8016e6c653f1" containerName="mariadb-account-create-update" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.026419 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1a8ba0-1b11-451f-a736-3258d120b849" containerName="mariadb-database-create" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.027056 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.030318 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.030364 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.030701 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.030779 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc8a87a1-5790-4635-bb73-24555866a07d-config-data\") pod \"keystone-db-sync-xq82t\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.030855 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9r2\" (UniqueName: \"kubernetes.io/projected/cc8a87a1-5790-4635-bb73-24555866a07d-kube-api-access-sw9r2\") pod \"keystone-db-sync-xq82t\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.032007 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-npsr6" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.040872 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-xq82t"] Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.132097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc8a87a1-5790-4635-bb73-24555866a07d-config-data\") pod \"keystone-db-sync-xq82t\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.132157 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw9r2\" (UniqueName: \"kubernetes.io/projected/cc8a87a1-5790-4635-bb73-24555866a07d-kube-api-access-sw9r2\") pod \"keystone-db-sync-xq82t\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.142387 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc8a87a1-5790-4635-bb73-24555866a07d-config-data\") pod \"keystone-db-sync-xq82t\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.147987 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9r2\" (UniqueName: \"kubernetes.io/projected/cc8a87a1-5790-4635-bb73-24555866a07d-kube-api-access-sw9r2\") pod \"keystone-db-sync-xq82t\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.344722 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:09 crc kubenswrapper[4747]: I1128 13:39:09.610158 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-xq82t"] Nov 28 13:39:10 crc kubenswrapper[4747]: I1128 13:39:10.234795 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" event={"ID":"cc8a87a1-5790-4635-bb73-24555866a07d","Type":"ContainerStarted","Data":"226ce138e3ee2a1ab628d88af01d22feb47e24d10a59f54fed1c808b80dcedc5"} Nov 28 13:39:10 crc kubenswrapper[4747]: I1128 13:39:10.235181 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" event={"ID":"cc8a87a1-5790-4635-bb73-24555866a07d","Type":"ContainerStarted","Data":"0fa90c4d11726cfbf15ae655e56c6514d6cc6f0e17e13bd5d5e80c3774696bcf"} Nov 28 13:39:10 crc kubenswrapper[4747]: I1128 13:39:10.256917 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" podStartSLOduration=2.256894227 podStartE2EDuration="2.256894227s" podCreationTimestamp="2025-11-28 13:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:39:10.250338352 +0000 UTC m=+1202.912820122" watchObservedRunningTime="2025-11-28 13:39:10.256894227 +0000 UTC m=+1202.919375977" Nov 28 13:39:12 crc kubenswrapper[4747]: I1128 13:39:12.251440 4747 generic.go:334] "Generic (PLEG): container finished" podID="cc8a87a1-5790-4635-bb73-24555866a07d" containerID="226ce138e3ee2a1ab628d88af01d22feb47e24d10a59f54fed1c808b80dcedc5" exitCode=0 Nov 28 13:39:12 crc kubenswrapper[4747]: I1128 13:39:12.251494 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" 
event={"ID":"cc8a87a1-5790-4635-bb73-24555866a07d","Type":"ContainerDied","Data":"226ce138e3ee2a1ab628d88af01d22feb47e24d10a59f54fed1c808b80dcedc5"} Nov 28 13:39:13 crc kubenswrapper[4747]: I1128 13:39:13.681554 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:13 crc kubenswrapper[4747]: I1128 13:39:13.803545 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw9r2\" (UniqueName: \"kubernetes.io/projected/cc8a87a1-5790-4635-bb73-24555866a07d-kube-api-access-sw9r2\") pod \"cc8a87a1-5790-4635-bb73-24555866a07d\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " Nov 28 13:39:13 crc kubenswrapper[4747]: I1128 13:39:13.803747 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc8a87a1-5790-4635-bb73-24555866a07d-config-data\") pod \"cc8a87a1-5790-4635-bb73-24555866a07d\" (UID: \"cc8a87a1-5790-4635-bb73-24555866a07d\") " Nov 28 13:39:13 crc kubenswrapper[4747]: I1128 13:39:13.812011 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8a87a1-5790-4635-bb73-24555866a07d-kube-api-access-sw9r2" (OuterVolumeSpecName: "kube-api-access-sw9r2") pod "cc8a87a1-5790-4635-bb73-24555866a07d" (UID: "cc8a87a1-5790-4635-bb73-24555866a07d"). InnerVolumeSpecName "kube-api-access-sw9r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:13 crc kubenswrapper[4747]: I1128 13:39:13.866773 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc8a87a1-5790-4635-bb73-24555866a07d-config-data" (OuterVolumeSpecName: "config-data") pod "cc8a87a1-5790-4635-bb73-24555866a07d" (UID: "cc8a87a1-5790-4635-bb73-24555866a07d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:13 crc kubenswrapper[4747]: I1128 13:39:13.905871 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc8a87a1-5790-4635-bb73-24555866a07d-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:13 crc kubenswrapper[4747]: I1128 13:39:13.905913 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw9r2\" (UniqueName: \"kubernetes.io/projected/cc8a87a1-5790-4635-bb73-24555866a07d-kube-api-access-sw9r2\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.294662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" event={"ID":"cc8a87a1-5790-4635-bb73-24555866a07d","Type":"ContainerDied","Data":"0fa90c4d11726cfbf15ae655e56c6514d6cc6f0e17e13bd5d5e80c3774696bcf"} Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.294704 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fa90c4d11726cfbf15ae655e56c6514d6cc6f0e17e13bd5d5e80c3774696bcf" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.294759 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-xq82t" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.477859 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r4cpd"] Nov 28 13:39:14 crc kubenswrapper[4747]: E1128 13:39:14.478260 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8a87a1-5790-4635-bb73-24555866a07d" containerName="keystone-db-sync" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.478279 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8a87a1-5790-4635-bb73-24555866a07d" containerName="keystone-db-sync" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.478449 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8a87a1-5790-4635-bb73-24555866a07d" containerName="keystone-db-sync" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.479124 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.483465 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.483695 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-npsr6" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.483852 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.483980 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.484147 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.496228 4747 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r4cpd"] Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.519366 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-scripts\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.519676 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-fernet-keys\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.519801 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-config-data\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.519888 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b48d\" (UniqueName: \"kubernetes.io/projected/410b67c8-b149-4a34-b2d9-9b454774a00d-kube-api-access-8b48d\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.519987 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-credential-keys\") pod 
\"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.620804 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b48d\" (UniqueName: \"kubernetes.io/projected/410b67c8-b149-4a34-b2d9-9b454774a00d-kube-api-access-8b48d\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.620879 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-credential-keys\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.620925 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-scripts\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.620987 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-fernet-keys\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.621047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-config-data\") pod \"keystone-bootstrap-r4cpd\" (UID: 
\"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.627289 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-fernet-keys\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.627362 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-credential-keys\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.628121 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-scripts\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.634329 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-config-data\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.647934 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b48d\" (UniqueName: \"kubernetes.io/projected/410b67c8-b149-4a34-b2d9-9b454774a00d-kube-api-access-8b48d\") pod \"keystone-bootstrap-r4cpd\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 
28 13:39:14 crc kubenswrapper[4747]: I1128 13:39:14.804772 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:15 crc kubenswrapper[4747]: I1128 13:39:15.250556 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r4cpd"] Nov 28 13:39:15 crc kubenswrapper[4747]: I1128 13:39:15.306726 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" event={"ID":"410b67c8-b149-4a34-b2d9-9b454774a00d","Type":"ContainerStarted","Data":"1b2e27c937ca011333cef9ea367a1aefdfe504fe95d078b5994577d2158d40b9"} Nov 28 13:39:16 crc kubenswrapper[4747]: I1128 13:39:16.318168 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" event={"ID":"410b67c8-b149-4a34-b2d9-9b454774a00d","Type":"ContainerStarted","Data":"1fea151d82f9613ef13905a9946d380a841e9c30cd420a4a5a116490c2719613"} Nov 28 13:39:18 crc kubenswrapper[4747]: I1128 13:39:18.336885 4747 generic.go:334] "Generic (PLEG): container finished" podID="410b67c8-b149-4a34-b2d9-9b454774a00d" containerID="1fea151d82f9613ef13905a9946d380a841e9c30cd420a4a5a116490c2719613" exitCode=0 Nov 28 13:39:18 crc kubenswrapper[4747]: I1128 13:39:18.336968 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" event={"ID":"410b67c8-b149-4a34-b2d9-9b454774a00d","Type":"ContainerDied","Data":"1fea151d82f9613ef13905a9946d380a841e9c30cd420a4a5a116490c2719613"} Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.729767 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.903335 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-config-data\") pod \"410b67c8-b149-4a34-b2d9-9b454774a00d\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.903422 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-scripts\") pod \"410b67c8-b149-4a34-b2d9-9b454774a00d\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.903482 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-credential-keys\") pod \"410b67c8-b149-4a34-b2d9-9b454774a00d\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.903527 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b48d\" (UniqueName: \"kubernetes.io/projected/410b67c8-b149-4a34-b2d9-9b454774a00d-kube-api-access-8b48d\") pod \"410b67c8-b149-4a34-b2d9-9b454774a00d\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.904497 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-fernet-keys\") pod \"410b67c8-b149-4a34-b2d9-9b454774a00d\" (UID: \"410b67c8-b149-4a34-b2d9-9b454774a00d\") " Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.909387 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-scripts" (OuterVolumeSpecName: "scripts") pod "410b67c8-b149-4a34-b2d9-9b454774a00d" (UID: "410b67c8-b149-4a34-b2d9-9b454774a00d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.909448 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "410b67c8-b149-4a34-b2d9-9b454774a00d" (UID: "410b67c8-b149-4a34-b2d9-9b454774a00d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.909568 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410b67c8-b149-4a34-b2d9-9b454774a00d-kube-api-access-8b48d" (OuterVolumeSpecName: "kube-api-access-8b48d") pod "410b67c8-b149-4a34-b2d9-9b454774a00d" (UID: "410b67c8-b149-4a34-b2d9-9b454774a00d"). InnerVolumeSpecName "kube-api-access-8b48d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.910884 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "410b67c8-b149-4a34-b2d9-9b454774a00d" (UID: "410b67c8-b149-4a34-b2d9-9b454774a00d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:19 crc kubenswrapper[4747]: I1128 13:39:19.932945 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-config-data" (OuterVolumeSpecName: "config-data") pod "410b67c8-b149-4a34-b2d9-9b454774a00d" (UID: "410b67c8-b149-4a34-b2d9-9b454774a00d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.006103 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.006139 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.006153 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.006164 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/410b67c8-b149-4a34-b2d9-9b454774a00d-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.006178 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b48d\" (UniqueName: \"kubernetes.io/projected/410b67c8-b149-4a34-b2d9-9b454774a00d-kube-api-access-8b48d\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.355428 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.355436 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r4cpd" event={"ID":"410b67c8-b149-4a34-b2d9-9b454774a00d","Type":"ContainerDied","Data":"1b2e27c937ca011333cef9ea367a1aefdfe504fe95d078b5994577d2158d40b9"} Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.355480 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b2e27c937ca011333cef9ea367a1aefdfe504fe95d078b5994577d2158d40b9" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.859393 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-dfbffb666-p9gvc"] Nov 28 13:39:20 crc kubenswrapper[4747]: E1128 13:39:20.859776 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410b67c8-b149-4a34-b2d9-9b454774a00d" containerName="keystone-bootstrap" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.859798 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="410b67c8-b149-4a34-b2d9-9b454774a00d" containerName="keystone-bootstrap" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.859989 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="410b67c8-b149-4a34-b2d9-9b454774a00d" containerName="keystone-bootstrap" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.860847 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.863509 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-npsr6" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.863892 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.864064 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.864829 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:39:20 crc kubenswrapper[4747]: I1128 13:39:20.879323 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-dfbffb666-p9gvc"] Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.022271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-config-data\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.022348 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-scripts\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.022419 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-fernet-keys\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.022572 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-credential-keys\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.022760 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2dd\" (UniqueName: \"kubernetes.io/projected/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-kube-api-access-sz2dd\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.124880 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2dd\" (UniqueName: \"kubernetes.io/projected/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-kube-api-access-sz2dd\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.124973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-config-data\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.125010 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-scripts\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.125066 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-fernet-keys\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.125119 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-credential-keys\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.132271 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-fernet-keys\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.132823 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-credential-keys\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.134834 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-scripts\") pod 
\"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.135141 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-config-data\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.150844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2dd\" (UniqueName: \"kubernetes.io/projected/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-kube-api-access-sz2dd\") pod \"keystone-dfbffb666-p9gvc\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.190118 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:21 crc kubenswrapper[4747]: I1128 13:39:21.693011 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-dfbffb666-p9gvc"] Nov 28 13:39:22 crc kubenswrapper[4747]: I1128 13:39:22.373385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" event={"ID":"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a","Type":"ContainerStarted","Data":"99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74"} Nov 28 13:39:22 crc kubenswrapper[4747]: I1128 13:39:22.373639 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:39:22 crc kubenswrapper[4747]: I1128 13:39:22.373651 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" event={"ID":"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a","Type":"ContainerStarted","Data":"b5d22748d1a1ac9a8b2136c9a222f43cb5bd572a0da30a2a9ac52bf325311f6b"} Nov 28 13:39:22 crc kubenswrapper[4747]: I1128 13:39:22.390882 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" podStartSLOduration=2.390867564 podStartE2EDuration="2.390867564s" podCreationTimestamp="2025-11-28 13:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:39:22.387342245 +0000 UTC m=+1215.049823975" watchObservedRunningTime="2025-11-28 13:39:22.390867564 +0000 UTC m=+1215.053349294" Nov 28 13:39:47 crc kubenswrapper[4747]: I1128 13:39:47.632990 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 28 13:39:47 crc kubenswrapper[4747]: I1128 13:39:47.633494 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:39:52 crc kubenswrapper[4747]: I1128 13:39:52.587648 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.722568 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-xq82t"] Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.739594 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r4cpd"] Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.747174 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r4cpd"] Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.752459 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-xq82t"] Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.758263 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-dfbffb666-p9gvc"] Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.758522 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" podUID="01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" containerName="keystone-api" containerID="cri-o://99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74" gracePeriod=30 Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.802627 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone055a-account-delete-gdfkp"] Nov 28 
13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.803423 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.818651 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone055a-account-delete-gdfkp"] Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.867544 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f11fd5-2422-4e68-b1f5-44bea4c81197-operator-scripts\") pod \"keystone055a-account-delete-gdfkp\" (UID: \"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.867609 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7t2\" (UniqueName: \"kubernetes.io/projected/51f11fd5-2422-4e68-b1f5-44bea4c81197-kube-api-access-4b7t2\") pod \"keystone055a-account-delete-gdfkp\" (UID: \"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.969353 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f11fd5-2422-4e68-b1f5-44bea4c81197-operator-scripts\") pod \"keystone055a-account-delete-gdfkp\" (UID: \"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.969455 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7t2\" (UniqueName: \"kubernetes.io/projected/51f11fd5-2422-4e68-b1f5-44bea4c81197-kube-api-access-4b7t2\") pod \"keystone055a-account-delete-gdfkp\" (UID: 
\"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:09 crc kubenswrapper[4747]: I1128 13:40:09.970810 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f11fd5-2422-4e68-b1f5-44bea4c81197-operator-scripts\") pod \"keystone055a-account-delete-gdfkp\" (UID: \"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:10 crc kubenswrapper[4747]: I1128 13:40:10.002156 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7t2\" (UniqueName: \"kubernetes.io/projected/51f11fd5-2422-4e68-b1f5-44bea4c81197-kube-api-access-4b7t2\") pod \"keystone055a-account-delete-gdfkp\" (UID: \"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:10 crc kubenswrapper[4747]: I1128 13:40:10.120025 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:10 crc kubenswrapper[4747]: I1128 13:40:10.614487 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone055a-account-delete-gdfkp"] Nov 28 13:40:10 crc kubenswrapper[4747]: I1128 13:40:10.812604 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" event={"ID":"51f11fd5-2422-4e68-b1f5-44bea4c81197","Type":"ContainerStarted","Data":"605fe20bbef91d588159bc0336459229a048eadec4f34f2b9c41acf5c62067d2"} Nov 28 13:40:10 crc kubenswrapper[4747]: I1128 13:40:10.812650 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" event={"ID":"51f11fd5-2422-4e68-b1f5-44bea4c81197","Type":"ContainerStarted","Data":"7b92b3cf24783608fbac518af0420e43766ef71b48d053a85415b55cf13ce021"} Nov 28 13:40:10 crc kubenswrapper[4747]: I1128 13:40:10.830309 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" podStartSLOduration=1.8302934629999998 podStartE2EDuration="1.830293463s" podCreationTimestamp="2025-11-28 13:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:40:10.823160234 +0000 UTC m=+1263.485641974" watchObservedRunningTime="2025-11-28 13:40:10.830293463 +0000 UTC m=+1263.492775193" Nov 28 13:40:11 crc kubenswrapper[4747]: I1128 13:40:11.655531 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410b67c8-b149-4a34-b2d9-9b454774a00d" path="/var/lib/kubelet/pods/410b67c8-b149-4a34-b2d9-9b454774a00d/volumes" Nov 28 13:40:11 crc kubenswrapper[4747]: I1128 13:40:11.656742 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8a87a1-5790-4635-bb73-24555866a07d" 
path="/var/lib/kubelet/pods/cc8a87a1-5790-4635-bb73-24555866a07d/volumes" Nov 28 13:40:11 crc kubenswrapper[4747]: I1128 13:40:11.824113 4747 generic.go:334] "Generic (PLEG): container finished" podID="51f11fd5-2422-4e68-b1f5-44bea4c81197" containerID="605fe20bbef91d588159bc0336459229a048eadec4f34f2b9c41acf5c62067d2" exitCode=0 Nov 28 13:40:11 crc kubenswrapper[4747]: I1128 13:40:11.824194 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" event={"ID":"51f11fd5-2422-4e68-b1f5-44bea4c81197","Type":"ContainerDied","Data":"605fe20bbef91d588159bc0336459229a048eadec4f34f2b9c41acf5c62067d2"} Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.166321 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.224785 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f11fd5-2422-4e68-b1f5-44bea4c81197-operator-scripts\") pod \"51f11fd5-2422-4e68-b1f5-44bea4c81197\" (UID: \"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.224856 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b7t2\" (UniqueName: \"kubernetes.io/projected/51f11fd5-2422-4e68-b1f5-44bea4c81197-kube-api-access-4b7t2\") pod \"51f11fd5-2422-4e68-b1f5-44bea4c81197\" (UID: \"51f11fd5-2422-4e68-b1f5-44bea4c81197\") " Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.225450 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f11fd5-2422-4e68-b1f5-44bea4c81197-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51f11fd5-2422-4e68-b1f5-44bea4c81197" (UID: "51f11fd5-2422-4e68-b1f5-44bea4c81197"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.228915 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.230713 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f11fd5-2422-4e68-b1f5-44bea4c81197-kube-api-access-4b7t2" (OuterVolumeSpecName: "kube-api-access-4b7t2") pod "51f11fd5-2422-4e68-b1f5-44bea4c81197" (UID: "51f11fd5-2422-4e68-b1f5-44bea4c81197"). InnerVolumeSpecName "kube-api-access-4b7t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.326125 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-fernet-keys\") pod \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.326238 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-credential-keys\") pod \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.326280 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-scripts\") pod \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.326395 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-config-data\") pod \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.326467 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz2dd\" (UniqueName: \"kubernetes.io/projected/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-kube-api-access-sz2dd\") pod \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\" (UID: \"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a\") " Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.326884 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f11fd5-2422-4e68-b1f5-44bea4c81197-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.326921 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b7t2\" (UniqueName: \"kubernetes.io/projected/51f11fd5-2422-4e68-b1f5-44bea4c81197-kube-api-access-4b7t2\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.329493 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" (UID: "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.330321 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-kube-api-access-sz2dd" (OuterVolumeSpecName: "kube-api-access-sz2dd") pod "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" (UID: "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a"). InnerVolumeSpecName "kube-api-access-sz2dd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.330370 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-scripts" (OuterVolumeSpecName: "scripts") pod "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" (UID: "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.331032 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" (UID: "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.343758 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-config-data" (OuterVolumeSpecName: "config-data") pod "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" (UID: "01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.429540 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.429603 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.429623 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.429641 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.429660 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz2dd\" (UniqueName: \"kubernetes.io/projected/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a-kube-api-access-sz2dd\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.845253 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" event={"ID":"51f11fd5-2422-4e68-b1f5-44bea4c81197","Type":"ContainerDied","Data":"7b92b3cf24783608fbac518af0420e43766ef71b48d053a85415b55cf13ce021"} Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.845300 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b92b3cf24783608fbac518af0420e43766ef71b48d053a85415b55cf13ce021" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.845910 4747 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone055a-account-delete-gdfkp" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.848850 4747 generic.go:334] "Generic (PLEG): container finished" podID="01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" containerID="99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74" exitCode=0 Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.848904 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.848909 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" event={"ID":"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a","Type":"ContainerDied","Data":"99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74"} Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.848971 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-dfbffb666-p9gvc" event={"ID":"01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a","Type":"ContainerDied","Data":"b5d22748d1a1ac9a8b2136c9a222f43cb5bd572a0da30a2a9ac52bf325311f6b"} Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.848992 4747 scope.go:117] "RemoveContainer" containerID="99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.892281 4747 scope.go:117] "RemoveContainer" containerID="99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74" Nov 28 13:40:13 crc kubenswrapper[4747]: E1128 13:40:13.892787 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74\": container with ID starting with 99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74 not found: ID does not exist" 
containerID="99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.892822 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74"} err="failed to get container status \"99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74\": rpc error: code = NotFound desc = could not find container \"99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74\": container with ID starting with 99ef8f36fca2feb6ff5586f2b9d35a628591518dddc31cec8397095e6ae74d74 not found: ID does not exist" Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.894730 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-dfbffb666-p9gvc"] Nov 28 13:40:13 crc kubenswrapper[4747]: I1128 13:40:13.904870 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-dfbffb666-p9gvc"] Nov 28 13:40:14 crc kubenswrapper[4747]: I1128 13:40:14.842552 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qdv5f"] Nov 28 13:40:14 crc kubenswrapper[4747]: I1128 13:40:14.853380 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qdv5f"] Nov 28 13:40:14 crc kubenswrapper[4747]: I1128 13:40:14.891505 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm"] Nov 28 13:40:14 crc kubenswrapper[4747]: I1128 13:40:14.899086 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone055a-account-delete-gdfkp"] Nov 28 13:40:14 crc kubenswrapper[4747]: I1128 13:40:14.905139 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone055a-account-delete-gdfkp"] Nov 28 13:40:14 crc kubenswrapper[4747]: I1128 13:40:14.911306 4747 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-055a-account-create-update-l7bfm"] Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.125652 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6vhkw"] Nov 28 13:40:15 crc kubenswrapper[4747]: E1128 13:40:15.126238 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" containerName="keystone-api" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.126268 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" containerName="keystone-api" Nov 28 13:40:15 crc kubenswrapper[4747]: E1128 13:40:15.126285 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f11fd5-2422-4e68-b1f5-44bea4c81197" containerName="mariadb-account-delete" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.126296 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f11fd5-2422-4e68-b1f5-44bea4c81197" containerName="mariadb-account-delete" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.126488 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" containerName="keystone-api" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.126514 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f11fd5-2422-4e68-b1f5-44bea4c81197" containerName="mariadb-account-delete" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.127172 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.149604 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr"] Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.150784 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.154303 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.157406 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f78845-e46a-450c-a48c-d615aaf3f006-operator-scripts\") pod \"keystone-db-create-6vhkw\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.157579 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfxxm\" (UniqueName: \"kubernetes.io/projected/89f78845-e46a-450c-a48c-d615aaf3f006-kube-api-access-dfxxm\") pod \"keystone-db-create-6vhkw\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.169659 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6vhkw"] Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.174969 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr"] Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.258773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7cd6\" (UniqueName: \"kubernetes.io/projected/9bf73483-8927-4503-a3fd-4ae240b239a2-kube-api-access-b7cd6\") pod \"keystone-5932-account-create-update-dnrrr\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.258846 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfxxm\" (UniqueName: \"kubernetes.io/projected/89f78845-e46a-450c-a48c-d615aaf3f006-kube-api-access-dfxxm\") pod \"keystone-db-create-6vhkw\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.259008 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bf73483-8927-4503-a3fd-4ae240b239a2-operator-scripts\") pod \"keystone-5932-account-create-update-dnrrr\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.259119 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f78845-e46a-450c-a48c-d615aaf3f006-operator-scripts\") pod \"keystone-db-create-6vhkw\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.260395 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f78845-e46a-450c-a48c-d615aaf3f006-operator-scripts\") pod \"keystone-db-create-6vhkw\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.292726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfxxm\" (UniqueName: \"kubernetes.io/projected/89f78845-e46a-450c-a48c-d615aaf3f006-kube-api-access-dfxxm\") pod \"keystone-db-create-6vhkw\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc 
kubenswrapper[4747]: I1128 13:40:15.360159 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bf73483-8927-4503-a3fd-4ae240b239a2-operator-scripts\") pod \"keystone-5932-account-create-update-dnrrr\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.360345 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7cd6\" (UniqueName: \"kubernetes.io/projected/9bf73483-8927-4503-a3fd-4ae240b239a2-kube-api-access-b7cd6\") pod \"keystone-5932-account-create-update-dnrrr\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.362040 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bf73483-8927-4503-a3fd-4ae240b239a2-operator-scripts\") pod \"keystone-5932-account-create-update-dnrrr\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.381448 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7cd6\" (UniqueName: \"kubernetes.io/projected/9bf73483-8927-4503-a3fd-4ae240b239a2-kube-api-access-b7cd6\") pod \"keystone-5932-account-create-update-dnrrr\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.446484 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.469678 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.651407 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a" path="/var/lib/kubelet/pods/01fd9e31-c1e0-427f-8b6f-705aa6ce9d4a/volumes" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.652540 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1a8ba0-1b11-451f-a736-3258d120b849" path="/var/lib/kubelet/pods/1f1a8ba0-1b11-451f-a736-3258d120b849/volumes" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.653377 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f11fd5-2422-4e68-b1f5-44bea4c81197" path="/var/lib/kubelet/pods/51f11fd5-2422-4e68-b1f5-44bea4c81197/volumes" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.654036 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d462d4-339b-4871-aefe-8016e6c653f1" path="/var/lib/kubelet/pods/89d462d4-339b-4871-aefe-8016e6c653f1/volumes" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.723619 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6vhkw"] Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.866895 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" event={"ID":"89f78845-e46a-450c-a48c-d615aaf3f006","Type":"ContainerStarted","Data":"9d4839005f40a70a67864c3ceb511b8bc0d5f41f64bae1bd751dcfebce144ec4"} Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.866963 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" 
event={"ID":"89f78845-e46a-450c-a48c-d615aaf3f006","Type":"ContainerStarted","Data":"100da5101e3f6520ebc0c00718e03baac7b51589069cd40360c1755b7304f223"} Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.885053 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" podStartSLOduration=0.88503364 podStartE2EDuration="885.03364ms" podCreationTimestamp="2025-11-28 13:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:40:15.881290237 +0000 UTC m=+1268.543771967" watchObservedRunningTime="2025-11-28 13:40:15.88503364 +0000 UTC m=+1268.547515370" Nov 28 13:40:15 crc kubenswrapper[4747]: I1128 13:40:15.967247 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr"] Nov 28 13:40:15 crc kubenswrapper[4747]: W1128 13:40:15.971273 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bf73483_8927_4503_a3fd_4ae240b239a2.slice/crio-02fe9385352e12135620ac7b22c53cb63eac71a140b741b61fc13c56b84fb8c7 WatchSource:0}: Error finding container 02fe9385352e12135620ac7b22c53cb63eac71a140b741b61fc13c56b84fb8c7: Status 404 returned error can't find the container with id 02fe9385352e12135620ac7b22c53cb63eac71a140b741b61fc13c56b84fb8c7 Nov 28 13:40:16 crc kubenswrapper[4747]: I1128 13:40:16.880544 4747 generic.go:334] "Generic (PLEG): container finished" podID="89f78845-e46a-450c-a48c-d615aaf3f006" containerID="9d4839005f40a70a67864c3ceb511b8bc0d5f41f64bae1bd751dcfebce144ec4" exitCode=0 Nov 28 13:40:16 crc kubenswrapper[4747]: I1128 13:40:16.881628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" 
event={"ID":"89f78845-e46a-450c-a48c-d615aaf3f006","Type":"ContainerDied","Data":"9d4839005f40a70a67864c3ceb511b8bc0d5f41f64bae1bd751dcfebce144ec4"} Nov 28 13:40:16 crc kubenswrapper[4747]: I1128 13:40:16.886006 4747 generic.go:334] "Generic (PLEG): container finished" podID="9bf73483-8927-4503-a3fd-4ae240b239a2" containerID="57018e1f8ec8169e5cc61f7432cefd32875fb3b12efdc67831b75a2655f5b222" exitCode=0 Nov 28 13:40:16 crc kubenswrapper[4747]: I1128 13:40:16.886073 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" event={"ID":"9bf73483-8927-4503-a3fd-4ae240b239a2","Type":"ContainerDied","Data":"57018e1f8ec8169e5cc61f7432cefd32875fb3b12efdc67831b75a2655f5b222"} Nov 28 13:40:16 crc kubenswrapper[4747]: I1128 13:40:16.886112 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" event={"ID":"9bf73483-8927-4503-a3fd-4ae240b239a2","Type":"ContainerStarted","Data":"02fe9385352e12135620ac7b22c53cb63eac71a140b741b61fc13c56b84fb8c7"} Nov 28 13:40:17 crc kubenswrapper[4747]: I1128 13:40:17.633112 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:40:17 crc kubenswrapper[4747]: I1128 13:40:17.633193 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.303148 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.307598 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.403653 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfxxm\" (UniqueName: \"kubernetes.io/projected/89f78845-e46a-450c-a48c-d615aaf3f006-kube-api-access-dfxxm\") pod \"89f78845-e46a-450c-a48c-d615aaf3f006\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.403750 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bf73483-8927-4503-a3fd-4ae240b239a2-operator-scripts\") pod \"9bf73483-8927-4503-a3fd-4ae240b239a2\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.403901 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7cd6\" (UniqueName: \"kubernetes.io/projected/9bf73483-8927-4503-a3fd-4ae240b239a2-kube-api-access-b7cd6\") pod \"9bf73483-8927-4503-a3fd-4ae240b239a2\" (UID: \"9bf73483-8927-4503-a3fd-4ae240b239a2\") " Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.403971 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f78845-e46a-450c-a48c-d615aaf3f006-operator-scripts\") pod \"89f78845-e46a-450c-a48c-d615aaf3f006\" (UID: \"89f78845-e46a-450c-a48c-d615aaf3f006\") " Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.404867 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89f78845-e46a-450c-a48c-d615aaf3f006-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "89f78845-e46a-450c-a48c-d615aaf3f006" (UID: "89f78845-e46a-450c-a48c-d615aaf3f006"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.404986 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bf73483-8927-4503-a3fd-4ae240b239a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9bf73483-8927-4503-a3fd-4ae240b239a2" (UID: "9bf73483-8927-4503-a3fd-4ae240b239a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.410918 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf73483-8927-4503-a3fd-4ae240b239a2-kube-api-access-b7cd6" (OuterVolumeSpecName: "kube-api-access-b7cd6") pod "9bf73483-8927-4503-a3fd-4ae240b239a2" (UID: "9bf73483-8927-4503-a3fd-4ae240b239a2"). InnerVolumeSpecName "kube-api-access-b7cd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.412325 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f78845-e46a-450c-a48c-d615aaf3f006-kube-api-access-dfxxm" (OuterVolumeSpecName: "kube-api-access-dfxxm") pod "89f78845-e46a-450c-a48c-d615aaf3f006" (UID: "89f78845-e46a-450c-a48c-d615aaf3f006"). InnerVolumeSpecName "kube-api-access-dfxxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.506028 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f78845-e46a-450c-a48c-d615aaf3f006-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.506148 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfxxm\" (UniqueName: \"kubernetes.io/projected/89f78845-e46a-450c-a48c-d615aaf3f006-kube-api-access-dfxxm\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.506175 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bf73483-8927-4503-a3fd-4ae240b239a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.506196 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7cd6\" (UniqueName: \"kubernetes.io/projected/9bf73483-8927-4503-a3fd-4ae240b239a2-kube-api-access-b7cd6\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.933451 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" event={"ID":"9bf73483-8927-4503-a3fd-4ae240b239a2","Type":"ContainerDied","Data":"02fe9385352e12135620ac7b22c53cb63eac71a140b741b61fc13c56b84fb8c7"} Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.933492 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.933522 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02fe9385352e12135620ac7b22c53cb63eac71a140b741b61fc13c56b84fb8c7" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.937362 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" event={"ID":"89f78845-e46a-450c-a48c-d615aaf3f006","Type":"ContainerDied","Data":"100da5101e3f6520ebc0c00718e03baac7b51589069cd40360c1755b7304f223"} Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.937472 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="100da5101e3f6520ebc0c00718e03baac7b51589069cd40360c1755b7304f223" Nov 28 13:40:18 crc kubenswrapper[4747]: I1128 13:40:18.937406 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6vhkw" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.704521 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gddwb"] Nov 28 13:40:20 crc kubenswrapper[4747]: E1128 13:40:20.704780 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f78845-e46a-450c-a48c-d615aaf3f006" containerName="mariadb-database-create" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.704793 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f78845-e46a-450c-a48c-d615aaf3f006" containerName="mariadb-database-create" Nov 28 13:40:20 crc kubenswrapper[4747]: E1128 13:40:20.704802 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf73483-8927-4503-a3fd-4ae240b239a2" containerName="mariadb-account-create-update" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.704808 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9bf73483-8927-4503-a3fd-4ae240b239a2" containerName="mariadb-account-create-update" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.704929 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f78845-e46a-450c-a48c-d615aaf3f006" containerName="mariadb-database-create" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.704951 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf73483-8927-4503-a3fd-4ae240b239a2" containerName="mariadb-account-create-update" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.705457 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.709784 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.710962 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-gnjs4" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.711360 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.713776 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.719437 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gddwb"] Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.775224 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a854eacc-7a37-48a7-97c7-4cd27fdfc915-config-data\") pod \"keystone-db-sync-gddwb\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:20 crc 
kubenswrapper[4747]: I1128 13:40:20.775498 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkdrj\" (UniqueName: \"kubernetes.io/projected/a854eacc-7a37-48a7-97c7-4cd27fdfc915-kube-api-access-wkdrj\") pod \"keystone-db-sync-gddwb\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.876884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a854eacc-7a37-48a7-97c7-4cd27fdfc915-config-data\") pod \"keystone-db-sync-gddwb\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.876996 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkdrj\" (UniqueName: \"kubernetes.io/projected/a854eacc-7a37-48a7-97c7-4cd27fdfc915-kube-api-access-wkdrj\") pod \"keystone-db-sync-gddwb\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.882590 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a854eacc-7a37-48a7-97c7-4cd27fdfc915-config-data\") pod \"keystone-db-sync-gddwb\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:20 crc kubenswrapper[4747]: I1128 13:40:20.898786 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkdrj\" (UniqueName: \"kubernetes.io/projected/a854eacc-7a37-48a7-97c7-4cd27fdfc915-kube-api-access-wkdrj\") pod \"keystone-db-sync-gddwb\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:21 crc kubenswrapper[4747]: I1128 
13:40:21.023149 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:21 crc kubenswrapper[4747]: I1128 13:40:21.225423 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gddwb"] Nov 28 13:40:21 crc kubenswrapper[4747]: I1128 13:40:21.980114 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" event={"ID":"a854eacc-7a37-48a7-97c7-4cd27fdfc915","Type":"ContainerStarted","Data":"465a919db7d94de464b25ea14d7d3ac670f55204803a0ba32e298f2296b6520e"} Nov 28 13:40:21 crc kubenswrapper[4747]: I1128 13:40:21.980739 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" event={"ID":"a854eacc-7a37-48a7-97c7-4cd27fdfc915","Type":"ContainerStarted","Data":"6c7be6f070a2b5eab1a4505757712a2d9917c2262370e6bddcf6241d0858b8a2"} Nov 28 13:40:22 crc kubenswrapper[4747]: I1128 13:40:22.009016 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" podStartSLOduration=2.008985435 podStartE2EDuration="2.008985435s" podCreationTimestamp="2025-11-28 13:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:40:22.002521361 +0000 UTC m=+1274.665003101" watchObservedRunningTime="2025-11-28 13:40:22.008985435 +0000 UTC m=+1274.671467185" Nov 28 13:40:22 crc kubenswrapper[4747]: I1128 13:40:22.993581 4747 generic.go:334] "Generic (PLEG): container finished" podID="a854eacc-7a37-48a7-97c7-4cd27fdfc915" containerID="465a919db7d94de464b25ea14d7d3ac670f55204803a0ba32e298f2296b6520e" exitCode=0 Nov 28 13:40:22 crc kubenswrapper[4747]: I1128 13:40:22.993643 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" 
event={"ID":"a854eacc-7a37-48a7-97c7-4cd27fdfc915","Type":"ContainerDied","Data":"465a919db7d94de464b25ea14d7d3ac670f55204803a0ba32e298f2296b6520e"} Nov 28 13:40:24 crc kubenswrapper[4747]: I1128 13:40:24.263744 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:24 crc kubenswrapper[4747]: I1128 13:40:24.334918 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkdrj\" (UniqueName: \"kubernetes.io/projected/a854eacc-7a37-48a7-97c7-4cd27fdfc915-kube-api-access-wkdrj\") pod \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " Nov 28 13:40:24 crc kubenswrapper[4747]: I1128 13:40:24.335036 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a854eacc-7a37-48a7-97c7-4cd27fdfc915-config-data\") pod \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\" (UID: \"a854eacc-7a37-48a7-97c7-4cd27fdfc915\") " Nov 28 13:40:24 crc kubenswrapper[4747]: I1128 13:40:24.340376 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a854eacc-7a37-48a7-97c7-4cd27fdfc915-kube-api-access-wkdrj" (OuterVolumeSpecName: "kube-api-access-wkdrj") pod "a854eacc-7a37-48a7-97c7-4cd27fdfc915" (UID: "a854eacc-7a37-48a7-97c7-4cd27fdfc915"). InnerVolumeSpecName "kube-api-access-wkdrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:24 crc kubenswrapper[4747]: I1128 13:40:24.367389 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a854eacc-7a37-48a7-97c7-4cd27fdfc915-config-data" (OuterVolumeSpecName: "config-data") pod "a854eacc-7a37-48a7-97c7-4cd27fdfc915" (UID: "a854eacc-7a37-48a7-97c7-4cd27fdfc915"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:24 crc kubenswrapper[4747]: I1128 13:40:24.437071 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkdrj\" (UniqueName: \"kubernetes.io/projected/a854eacc-7a37-48a7-97c7-4cd27fdfc915-kube-api-access-wkdrj\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:24 crc kubenswrapper[4747]: I1128 13:40:24.437102 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a854eacc-7a37-48a7-97c7-4cd27fdfc915-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.017019 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" event={"ID":"a854eacc-7a37-48a7-97c7-4cd27fdfc915","Type":"ContainerDied","Data":"6c7be6f070a2b5eab1a4505757712a2d9917c2262370e6bddcf6241d0858b8a2"} Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.017082 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7be6f070a2b5eab1a4505757712a2d9917c2262370e6bddcf6241d0858b8a2" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.017116 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gddwb" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.202580 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cnhv9"] Nov 28 13:40:25 crc kubenswrapper[4747]: E1128 13:40:25.203405 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a854eacc-7a37-48a7-97c7-4cd27fdfc915" containerName="keystone-db-sync" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.203440 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a854eacc-7a37-48a7-97c7-4cd27fdfc915" containerName="keystone-db-sync" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.203708 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a854eacc-7a37-48a7-97c7-4cd27fdfc915" containerName="keystone-db-sync" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.204516 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.206676 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.207268 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-gnjs4" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.208418 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.209461 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.209684 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.230603 4747 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cnhv9"] Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.250075 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtrkl\" (UniqueName: \"kubernetes.io/projected/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-kube-api-access-rtrkl\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.250130 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-fernet-keys\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.250181 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-config-data\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.250259 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-scripts\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.250282 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-credential-keys\") pod 
\"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.351031 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-config-data\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.351104 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-scripts\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.351134 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-credential-keys\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.351156 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtrkl\" (UniqueName: \"kubernetes.io/projected/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-kube-api-access-rtrkl\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.351189 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-fernet-keys\") pod \"keystone-bootstrap-cnhv9\" (UID: 
\"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.356150 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-scripts\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.357071 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-config-data\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.361762 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-credential-keys\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.368105 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-fernet-keys\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.369686 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtrkl\" (UniqueName: \"kubernetes.io/projected/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-kube-api-access-rtrkl\") pod \"keystone-bootstrap-cnhv9\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 
28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.530551 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:25 crc kubenswrapper[4747]: I1128 13:40:25.795409 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cnhv9"] Nov 28 13:40:25 crc kubenswrapper[4747]: W1128 13:40:25.801985 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c11ddd_fa23_4030_ac09_1789e08a0d5f.slice/crio-d610a79c8a037f4bda25afdd176a70ed8151088a8d71fdf34884ea6b62a740da WatchSource:0}: Error finding container d610a79c8a037f4bda25afdd176a70ed8151088a8d71fdf34884ea6b62a740da: Status 404 returned error can't find the container with id d610a79c8a037f4bda25afdd176a70ed8151088a8d71fdf34884ea6b62a740da Nov 28 13:40:26 crc kubenswrapper[4747]: I1128 13:40:26.028380 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" event={"ID":"a1c11ddd-fa23-4030-ac09-1789e08a0d5f","Type":"ContainerStarted","Data":"d610a79c8a037f4bda25afdd176a70ed8151088a8d71fdf34884ea6b62a740da"} Nov 28 13:40:27 crc kubenswrapper[4747]: I1128 13:40:27.038794 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" event={"ID":"a1c11ddd-fa23-4030-ac09-1789e08a0d5f","Type":"ContainerStarted","Data":"ae03412466d114dba4d0f3f8ca3cd005434c7e451a3fd6a0f7fc626b583b5792"} Nov 28 13:40:27 crc kubenswrapper[4747]: I1128 13:40:27.067225 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" podStartSLOduration=2.06719139 podStartE2EDuration="2.06719139s" podCreationTimestamp="2025-11-28 13:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 
13:40:27.06626161 +0000 UTC m=+1279.728743380" watchObservedRunningTime="2025-11-28 13:40:27.06719139 +0000 UTC m=+1279.729673120" Nov 28 13:40:29 crc kubenswrapper[4747]: I1128 13:40:29.068899 4747 generic.go:334] "Generic (PLEG): container finished" podID="a1c11ddd-fa23-4030-ac09-1789e08a0d5f" containerID="ae03412466d114dba4d0f3f8ca3cd005434c7e451a3fd6a0f7fc626b583b5792" exitCode=0 Nov 28 13:40:29 crc kubenswrapper[4747]: I1128 13:40:29.069032 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" event={"ID":"a1c11ddd-fa23-4030-ac09-1789e08a0d5f","Type":"ContainerDied","Data":"ae03412466d114dba4d0f3f8ca3cd005434c7e451a3fd6a0f7fc626b583b5792"} Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.397378 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.565331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-scripts\") pod \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.565382 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-config-data\") pod \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.565429 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtrkl\" (UniqueName: \"kubernetes.io/projected/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-kube-api-access-rtrkl\") pod \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 
13:40:30.565463 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-fernet-keys\") pod \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.565501 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-credential-keys\") pod \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\" (UID: \"a1c11ddd-fa23-4030-ac09-1789e08a0d5f\") " Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.574801 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a1c11ddd-fa23-4030-ac09-1789e08a0d5f" (UID: "a1c11ddd-fa23-4030-ac09-1789e08a0d5f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.574898 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a1c11ddd-fa23-4030-ac09-1789e08a0d5f" (UID: "a1c11ddd-fa23-4030-ac09-1789e08a0d5f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.578517 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-kube-api-access-rtrkl" (OuterVolumeSpecName: "kube-api-access-rtrkl") pod "a1c11ddd-fa23-4030-ac09-1789e08a0d5f" (UID: "a1c11ddd-fa23-4030-ac09-1789e08a0d5f"). InnerVolumeSpecName "kube-api-access-rtrkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.578707 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-scripts" (OuterVolumeSpecName: "scripts") pod "a1c11ddd-fa23-4030-ac09-1789e08a0d5f" (UID: "a1c11ddd-fa23-4030-ac09-1789e08a0d5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.603279 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-config-data" (OuterVolumeSpecName: "config-data") pod "a1c11ddd-fa23-4030-ac09-1789e08a0d5f" (UID: "a1c11ddd-fa23-4030-ac09-1789e08a0d5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.667645 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.667677 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.667689 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtrkl\" (UniqueName: \"kubernetes.io/projected/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-kube-api-access-rtrkl\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 13:40:30.667699 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:30 crc kubenswrapper[4747]: I1128 
13:40:30.667707 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1c11ddd-fa23-4030-ac09-1789e08a0d5f-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.086686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" event={"ID":"a1c11ddd-fa23-4030-ac09-1789e08a0d5f","Type":"ContainerDied","Data":"d610a79c8a037f4bda25afdd176a70ed8151088a8d71fdf34884ea6b62a740da"} Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.086723 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cnhv9" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.086739 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d610a79c8a037f4bda25afdd176a70ed8151088a8d71fdf34884ea6b62a740da" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.165184 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-66465fdcd5-dvr7p"] Nov 28 13:40:31 crc kubenswrapper[4747]: E1128 13:40:31.166905 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c11ddd-fa23-4030-ac09-1789e08a0d5f" containerName="keystone-bootstrap" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.167008 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c11ddd-fa23-4030-ac09-1789e08a0d5f" containerName="keystone-bootstrap" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.167267 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c11ddd-fa23-4030-ac09-1789e08a0d5f" containerName="keystone-bootstrap" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.167987 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.171599 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-gnjs4" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.171913 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.172187 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.172682 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.174929 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-66465fdcd5-dvr7p"] Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.277294 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkdzk\" (UniqueName: \"kubernetes.io/projected/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-kube-api-access-rkdzk\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.277343 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-config-data\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.277365 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-fernet-keys\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.277388 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-credential-keys\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.277514 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-scripts\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.379137 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkdzk\" (UniqueName: \"kubernetes.io/projected/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-kube-api-access-rkdzk\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.379404 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-config-data\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.379539 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-fernet-keys\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.379644 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-credential-keys\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.379768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-scripts\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.383352 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-scripts\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.383467 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-credential-keys\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.383689 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-config-data\") pod 
\"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.385069 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-fernet-keys\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.395755 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkdzk\" (UniqueName: \"kubernetes.io/projected/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-kube-api-access-rkdzk\") pod \"keystone-66465fdcd5-dvr7p\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.490944 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:31 crc kubenswrapper[4747]: I1128 13:40:31.941069 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-66465fdcd5-dvr7p"] Nov 28 13:40:32 crc kubenswrapper[4747]: I1128 13:40:32.098560 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" event={"ID":"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119","Type":"ContainerStarted","Data":"e0b19e0f614a81d9de7af066618587997d885bc312037114741abbc557d2a1c3"} Nov 28 13:40:32 crc kubenswrapper[4747]: I1128 13:40:32.098960 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" event={"ID":"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119","Type":"ContainerStarted","Data":"52389b9070e3decad73e78aebf1e9a31618fb8b127b68ff944e5ca9a664da3b9"} Nov 28 13:40:32 crc kubenswrapper[4747]: I1128 13:40:32.098992 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:40:32 crc kubenswrapper[4747]: I1128 13:40:32.129714 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" podStartSLOduration=1.129677062 podStartE2EDuration="1.129677062s" podCreationTimestamp="2025-11-28 13:40:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:40:32.118675237 +0000 UTC m=+1284.781157017" watchObservedRunningTime="2025-11-28 13:40:32.129677062 +0000 UTC m=+1284.792158832" Nov 28 13:40:47 crc kubenswrapper[4747]: I1128 13:40:47.633362 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 28 13:40:47 crc kubenswrapper[4747]: I1128 13:40:47.634410 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:40:47 crc kubenswrapper[4747]: I1128 13:40:47.634495 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:40:47 crc kubenswrapper[4747]: I1128 13:40:47.635265 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2771dccb85c0ecd4859ca56d594c33cf7a03691a61ea3867cc5df5fbf1dd95c"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:40:47 crc kubenswrapper[4747]: I1128 13:40:47.635367 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://b2771dccb85c0ecd4859ca56d594c33cf7a03691a61ea3867cc5df5fbf1dd95c" gracePeriod=600 Nov 28 13:40:48 crc kubenswrapper[4747]: I1128 13:40:48.233690 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="b2771dccb85c0ecd4859ca56d594c33cf7a03691a61ea3867cc5df5fbf1dd95c" exitCode=0 Nov 28 13:40:48 crc kubenswrapper[4747]: I1128 13:40:48.233715 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" 
event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"b2771dccb85c0ecd4859ca56d594c33cf7a03691a61ea3867cc5df5fbf1dd95c"} Nov 28 13:40:48 crc kubenswrapper[4747]: I1128 13:40:48.234105 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161"} Nov 28 13:40:48 crc kubenswrapper[4747]: I1128 13:40:48.234136 4747 scope.go:117] "RemoveContainer" containerID="08e13cafa96480abebcf6277e7d8891630344ed15e24b7ed7d255d3a6f63b7d5" Nov 28 13:41:02 crc kubenswrapper[4747]: I1128 13:41:02.924933 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.315894 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.317508 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.321347 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"default-dockercfg-7mrrh" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.322451 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.322455 4747 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-config-secret" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.328975 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.410412 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.410545 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config-secret\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.410809 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8nj5\" (UniqueName: \"kubernetes.io/projected/4379fc67-e837-4fc6-bafd-ccf286d37b67-kube-api-access-g8nj5\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc 
kubenswrapper[4747]: I1128 13:41:04.512458 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8nj5\" (UniqueName: \"kubernetes.io/projected/4379fc67-e837-4fc6-bafd-ccf286d37b67-kube-api-access-g8nj5\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.512513 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.512551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config-secret\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.513933 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.533449 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config-secret\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.542518 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g8nj5\" (UniqueName: \"kubernetes.io/projected/4379fc67-e837-4fc6-bafd-ccf286d37b67-kube-api-access-g8nj5\") pod \"openstackclient\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.668788 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.929923 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:41:04 crc kubenswrapper[4747]: I1128 13:41:04.943768 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:41:05 crc kubenswrapper[4747]: I1128 13:41:05.406194 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4379fc67-e837-4fc6-bafd-ccf286d37b67","Type":"ContainerStarted","Data":"f20a72c688327e9eddde26f158adc59982c35292dbcc516d7d73beaa4e194edb"} Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.693058 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n7x6z"] Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.696858 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.702101 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7x6z"] Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.838415 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-utilities\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.838564 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-catalog-content\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.838610 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgw67\" (UniqueName: \"kubernetes.io/projected/4827d521-de6e-414a-8725-a667a7d5f1b9-kube-api-access-cgw67\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.940308 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-utilities\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.940445 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-catalog-content\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.940487 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgw67\" (UniqueName: \"kubernetes.io/projected/4827d521-de6e-414a-8725-a667a7d5f1b9-kube-api-access-cgw67\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.941019 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-utilities\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.941049 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-catalog-content\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:11 crc kubenswrapper[4747]: I1128 13:41:11.973500 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgw67\" (UniqueName: \"kubernetes.io/projected/4827d521-de6e-414a-8725-a667a7d5f1b9-kube-api-access-cgw67\") pod \"certified-operators-n7x6z\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:12 crc kubenswrapper[4747]: I1128 13:41:12.016056 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:13 crc kubenswrapper[4747]: W1128 13:41:13.784595 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4827d521_de6e_414a_8725_a667a7d5f1b9.slice/crio-2f4d14ba9cd5f398333e13e9f09c431d6e2a89ec13e96a3cf970f7294cfb80c6 WatchSource:0}: Error finding container 2f4d14ba9cd5f398333e13e9f09c431d6e2a89ec13e96a3cf970f7294cfb80c6: Status 404 returned error can't find the container with id 2f4d14ba9cd5f398333e13e9f09c431d6e2a89ec13e96a3cf970f7294cfb80c6 Nov 28 13:41:13 crc kubenswrapper[4747]: I1128 13:41:13.785650 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7x6z"] Nov 28 13:41:14 crc kubenswrapper[4747]: I1128 13:41:14.477706 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4379fc67-e837-4fc6-bafd-ccf286d37b67","Type":"ContainerStarted","Data":"20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c"} Nov 28 13:41:14 crc kubenswrapper[4747]: I1128 13:41:14.479596 4747 generic.go:334] "Generic (PLEG): container finished" podID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerID="8d86f18a5e4cc6f53c66b4f44762e1f6bbe27993b233539ad244502090e1c540" exitCode=0 Nov 28 13:41:14 crc kubenswrapper[4747]: I1128 13:41:14.479634 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7x6z" event={"ID":"4827d521-de6e-414a-8725-a667a7d5f1b9","Type":"ContainerDied","Data":"8d86f18a5e4cc6f53c66b4f44762e1f6bbe27993b233539ad244502090e1c540"} Nov 28 13:41:14 crc kubenswrapper[4747]: I1128 13:41:14.479654 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7x6z" 
event={"ID":"4827d521-de6e-414a-8725-a667a7d5f1b9","Type":"ContainerStarted","Data":"2f4d14ba9cd5f398333e13e9f09c431d6e2a89ec13e96a3cf970f7294cfb80c6"} Nov 28 13:41:14 crc kubenswrapper[4747]: I1128 13:41:14.498525 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstackclient" podStartSLOduration=2.145504835 podStartE2EDuration="10.498500247s" podCreationTimestamp="2025-11-28 13:41:04 +0000 UTC" firstStartedPulling="2025-11-28 13:41:04.943406799 +0000 UTC m=+1317.605888569" lastFinishedPulling="2025-11-28 13:41:13.296402211 +0000 UTC m=+1325.958883981" observedRunningTime="2025-11-28 13:41:14.492086484 +0000 UTC m=+1327.154568234" watchObservedRunningTime="2025-11-28 13:41:14.498500247 +0000 UTC m=+1327.160982017" Nov 28 13:41:16 crc kubenswrapper[4747]: I1128 13:41:16.498806 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7x6z" event={"ID":"4827d521-de6e-414a-8725-a667a7d5f1b9","Type":"ContainerStarted","Data":"e5b5169cf148daeec64239ccd0f5c69cbb9ab4d3bd8eec3976442f85c6feed1d"} Nov 28 13:41:17 crc kubenswrapper[4747]: I1128 13:41:17.511291 4747 generic.go:334] "Generic (PLEG): container finished" podID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerID="e5b5169cf148daeec64239ccd0f5c69cbb9ab4d3bd8eec3976442f85c6feed1d" exitCode=0 Nov 28 13:41:17 crc kubenswrapper[4747]: I1128 13:41:17.511372 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7x6z" event={"ID":"4827d521-de6e-414a-8725-a667a7d5f1b9","Type":"ContainerDied","Data":"e5b5169cf148daeec64239ccd0f5c69cbb9ab4d3bd8eec3976442f85c6feed1d"} Nov 28 13:41:18 crc kubenswrapper[4747]: I1128 13:41:18.532928 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7x6z" 
event={"ID":"4827d521-de6e-414a-8725-a667a7d5f1b9","Type":"ContainerStarted","Data":"c22608b98c6d8c802f4a987cab898caeb43eaf7e50163ab5e162272624e43d7e"} Nov 28 13:41:18 crc kubenswrapper[4747]: I1128 13:41:18.561593 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n7x6z" podStartSLOduration=3.968446695 podStartE2EDuration="7.561570648s" podCreationTimestamp="2025-11-28 13:41:11 +0000 UTC" firstStartedPulling="2025-11-28 13:41:14.481158501 +0000 UTC m=+1327.143640231" lastFinishedPulling="2025-11-28 13:41:18.074282444 +0000 UTC m=+1330.736764184" observedRunningTime="2025-11-28 13:41:18.556013314 +0000 UTC m=+1331.218495044" watchObservedRunningTime="2025-11-28 13:41:18.561570648 +0000 UTC m=+1331.224052378" Nov 28 13:41:22 crc kubenswrapper[4747]: I1128 13:41:22.016608 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:22 crc kubenswrapper[4747]: I1128 13:41:22.017372 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:22 crc kubenswrapper[4747]: I1128 13:41:22.098920 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.618034 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gvwln"] Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.621298 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.657697 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gvwln"] Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.784748 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-utilities\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.784890 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-catalog-content\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.785021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5xwf\" (UniqueName: \"kubernetes.io/projected/35df634a-7694-4627-b8b7-ad80526137cb-kube-api-access-l5xwf\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.886019 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-utilities\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.886073 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-catalog-content\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.886148 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5xwf\" (UniqueName: \"kubernetes.io/projected/35df634a-7694-4627-b8b7-ad80526137cb-kube-api-access-l5xwf\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.886532 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-utilities\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.886602 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-catalog-content\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.906474 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5xwf\" (UniqueName: \"kubernetes.io/projected/35df634a-7694-4627-b8b7-ad80526137cb-kube-api-access-l5xwf\") pod \"redhat-operators-gvwln\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:31 crc kubenswrapper[4747]: I1128 13:41:31.949586 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:32 crc kubenswrapper[4747]: I1128 13:41:32.074147 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:32 crc kubenswrapper[4747]: I1128 13:41:32.182570 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gvwln"] Nov 28 13:41:32 crc kubenswrapper[4747]: W1128 13:41:32.186961 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35df634a_7694_4627_b8b7_ad80526137cb.slice/crio-2e1284212a91e3152965533ad48e6358899b09155396fe550ee578f34c438898 WatchSource:0}: Error finding container 2e1284212a91e3152965533ad48e6358899b09155396fe550ee578f34c438898: Status 404 returned error can't find the container with id 2e1284212a91e3152965533ad48e6358899b09155396fe550ee578f34c438898 Nov 28 13:41:32 crc kubenswrapper[4747]: I1128 13:41:32.690662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvwln" event={"ID":"35df634a-7694-4627-b8b7-ad80526137cb","Type":"ContainerStarted","Data":"2e1284212a91e3152965533ad48e6358899b09155396fe550ee578f34c438898"} Nov 28 13:41:33 crc kubenswrapper[4747]: I1128 13:41:33.702134 4747 generic.go:334] "Generic (PLEG): container finished" podID="35df634a-7694-4627-b8b7-ad80526137cb" containerID="5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5" exitCode=0 Nov 28 13:41:33 crc kubenswrapper[4747]: I1128 13:41:33.702259 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvwln" event={"ID":"35df634a-7694-4627-b8b7-ad80526137cb","Type":"ContainerDied","Data":"5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5"} Nov 28 13:41:34 crc kubenswrapper[4747]: I1128 13:41:34.398261 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-n7x6z"] Nov 28 13:41:34 crc kubenswrapper[4747]: I1128 13:41:34.399104 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n7x6z" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="registry-server" containerID="cri-o://c22608b98c6d8c802f4a987cab898caeb43eaf7e50163ab5e162272624e43d7e" gracePeriod=2 Nov 28 13:41:34 crc kubenswrapper[4747]: I1128 13:41:34.727981 4747 generic.go:334] "Generic (PLEG): container finished" podID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerID="c22608b98c6d8c802f4a987cab898caeb43eaf7e50163ab5e162272624e43d7e" exitCode=0 Nov 28 13:41:34 crc kubenswrapper[4747]: I1128 13:41:34.728053 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7x6z" event={"ID":"4827d521-de6e-414a-8725-a667a7d5f1b9","Type":"ContainerDied","Data":"c22608b98c6d8c802f4a987cab898caeb43eaf7e50163ab5e162272624e43d7e"} Nov 28 13:41:34 crc kubenswrapper[4747]: I1128 13:41:34.730369 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvwln" event={"ID":"35df634a-7694-4627-b8b7-ad80526137cb","Type":"ContainerStarted","Data":"35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d"} Nov 28 13:41:34 crc kubenswrapper[4747]: I1128 13:41:34.888632 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.033470 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-utilities\") pod \"4827d521-de6e-414a-8725-a667a7d5f1b9\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.033613 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-catalog-content\") pod \"4827d521-de6e-414a-8725-a667a7d5f1b9\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.033686 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgw67\" (UniqueName: \"kubernetes.io/projected/4827d521-de6e-414a-8725-a667a7d5f1b9-kube-api-access-cgw67\") pod \"4827d521-de6e-414a-8725-a667a7d5f1b9\" (UID: \"4827d521-de6e-414a-8725-a667a7d5f1b9\") " Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.036363 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-utilities" (OuterVolumeSpecName: "utilities") pod "4827d521-de6e-414a-8725-a667a7d5f1b9" (UID: "4827d521-de6e-414a-8725-a667a7d5f1b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.049315 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4827d521-de6e-414a-8725-a667a7d5f1b9-kube-api-access-cgw67" (OuterVolumeSpecName: "kube-api-access-cgw67") pod "4827d521-de6e-414a-8725-a667a7d5f1b9" (UID: "4827d521-de6e-414a-8725-a667a7d5f1b9"). InnerVolumeSpecName "kube-api-access-cgw67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.113862 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4827d521-de6e-414a-8725-a667a7d5f1b9" (UID: "4827d521-de6e-414a-8725-a667a7d5f1b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.136009 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.136043 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgw67\" (UniqueName: \"kubernetes.io/projected/4827d521-de6e-414a-8725-a667a7d5f1b9-kube-api-access-cgw67\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.136057 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4827d521-de6e-414a-8725-a667a7d5f1b9-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.740738 4747 generic.go:334] "Generic (PLEG): container finished" podID="35df634a-7694-4627-b8b7-ad80526137cb" containerID="35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d" exitCode=0 Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.740833 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvwln" event={"ID":"35df634a-7694-4627-b8b7-ad80526137cb","Type":"ContainerDied","Data":"35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d"} Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.746068 4747 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-n7x6z" event={"ID":"4827d521-de6e-414a-8725-a667a7d5f1b9","Type":"ContainerDied","Data":"2f4d14ba9cd5f398333e13e9f09c431d6e2a89ec13e96a3cf970f7294cfb80c6"} Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.746144 4747 scope.go:117] "RemoveContainer" containerID="c22608b98c6d8c802f4a987cab898caeb43eaf7e50163ab5e162272624e43d7e" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.746306 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7x6z" Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.798056 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n7x6z"] Nov 28 13:41:35 crc kubenswrapper[4747]: I1128 13:41:35.807834 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n7x6z"] Nov 28 13:41:36 crc kubenswrapper[4747]: I1128 13:41:36.009265 4747 scope.go:117] "RemoveContainer" containerID="e5b5169cf148daeec64239ccd0f5c69cbb9ab4d3bd8eec3976442f85c6feed1d" Nov 28 13:41:36 crc kubenswrapper[4747]: I1128 13:41:36.045615 4747 scope.go:117] "RemoveContainer" containerID="8d86f18a5e4cc6f53c66b4f44762e1f6bbe27993b233539ad244502090e1c540" Nov 28 13:41:37 crc kubenswrapper[4747]: I1128 13:41:37.650417 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" path="/var/lib/kubelet/pods/4827d521-de6e-414a-8725-a667a7d5f1b9/volumes" Nov 28 13:41:37 crc kubenswrapper[4747]: I1128 13:41:37.773505 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvwln" event={"ID":"35df634a-7694-4627-b8b7-ad80526137cb","Type":"ContainerStarted","Data":"ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86"} Nov 28 13:41:37 crc kubenswrapper[4747]: I1128 13:41:37.800295 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-gvwln" podStartSLOduration=3.274327281 podStartE2EDuration="6.800269237s" podCreationTimestamp="2025-11-28 13:41:31 +0000 UTC" firstStartedPulling="2025-11-28 13:41:33.704013449 +0000 UTC m=+1346.366495219" lastFinishedPulling="2025-11-28 13:41:37.229955415 +0000 UTC m=+1349.892437175" observedRunningTime="2025-11-28 13:41:37.795518771 +0000 UTC m=+1350.458000531" watchObservedRunningTime="2025-11-28 13:41:37.800269237 +0000 UTC m=+1350.462750957" Nov 28 13:41:41 crc kubenswrapper[4747]: I1128 13:41:41.949959 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:41 crc kubenswrapper[4747]: I1128 13:41:41.950593 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:43 crc kubenswrapper[4747]: I1128 13:41:43.008561 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gvwln" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="registry-server" probeResult="failure" output=< Nov 28 13:41:43 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Nov 28 13:41:43 crc kubenswrapper[4747]: > Nov 28 13:41:51 crc kubenswrapper[4747]: I1128 13:41:51.991370 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:52 crc kubenswrapper[4747]: I1128 13:41:52.049074 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:52 crc kubenswrapper[4747]: I1128 13:41:52.223241 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gvwln"] Nov 28 13:41:53 crc kubenswrapper[4747]: I1128 13:41:53.911921 4747 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-gvwln" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="registry-server" containerID="cri-o://ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86" gracePeriod=2 Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.356454 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.449843 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-catalog-content\") pod \"35df634a-7694-4627-b8b7-ad80526137cb\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.449881 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-utilities\") pod \"35df634a-7694-4627-b8b7-ad80526137cb\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.449984 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5xwf\" (UniqueName: \"kubernetes.io/projected/35df634a-7694-4627-b8b7-ad80526137cb-kube-api-access-l5xwf\") pod \"35df634a-7694-4627-b8b7-ad80526137cb\" (UID: \"35df634a-7694-4627-b8b7-ad80526137cb\") " Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.451496 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-utilities" (OuterVolumeSpecName: "utilities") pod "35df634a-7694-4627-b8b7-ad80526137cb" (UID: "35df634a-7694-4627-b8b7-ad80526137cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.454914 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35df634a-7694-4627-b8b7-ad80526137cb-kube-api-access-l5xwf" (OuterVolumeSpecName: "kube-api-access-l5xwf") pod "35df634a-7694-4627-b8b7-ad80526137cb" (UID: "35df634a-7694-4627-b8b7-ad80526137cb"). InnerVolumeSpecName "kube-api-access-l5xwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.551984 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.552204 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5xwf\" (UniqueName: \"kubernetes.io/projected/35df634a-7694-4627-b8b7-ad80526137cb-kube-api-access-l5xwf\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.563081 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35df634a-7694-4627-b8b7-ad80526137cb" (UID: "35df634a-7694-4627-b8b7-ad80526137cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.653381 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35df634a-7694-4627-b8b7-ad80526137cb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.926457 4747 generic.go:334] "Generic (PLEG): container finished" podID="35df634a-7694-4627-b8b7-ad80526137cb" containerID="ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86" exitCode=0 Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.926512 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvwln" event={"ID":"35df634a-7694-4627-b8b7-ad80526137cb","Type":"ContainerDied","Data":"ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86"} Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.926552 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvwln" event={"ID":"35df634a-7694-4627-b8b7-ad80526137cb","Type":"ContainerDied","Data":"2e1284212a91e3152965533ad48e6358899b09155396fe550ee578f34c438898"} Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.926583 4747 scope.go:117] "RemoveContainer" containerID="ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.926618 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gvwln" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.958444 4747 scope.go:117] "RemoveContainer" containerID="35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d" Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.982902 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gvwln"] Nov 28 13:41:54 crc kubenswrapper[4747]: I1128 13:41:54.988521 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gvwln"] Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:54.999991 4747 scope.go:117] "RemoveContainer" containerID="5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5" Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:55.041561 4747 scope.go:117] "RemoveContainer" containerID="ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86" Nov 28 13:41:55 crc kubenswrapper[4747]: E1128 13:41:55.042051 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86\": container with ID starting with ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86 not found: ID does not exist" containerID="ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86" Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:55.042098 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86"} err="failed to get container status \"ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86\": rpc error: code = NotFound desc = could not find container \"ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86\": container with ID starting with ef515e598dcef8f8db9ae8614def5417598e574aa631502cad9ee752d93bcb86 not found: ID does 
not exist" Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:55.042129 4747 scope.go:117] "RemoveContainer" containerID="35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d" Nov 28 13:41:55 crc kubenswrapper[4747]: E1128 13:41:55.042880 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d\": container with ID starting with 35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d not found: ID does not exist" containerID="35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d" Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:55.042915 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d"} err="failed to get container status \"35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d\": rpc error: code = NotFound desc = could not find container \"35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d\": container with ID starting with 35fdecb126226c670dfb1483424cffad91f9579cf75ba6537b95cee036d0e52d not found: ID does not exist" Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:55.042935 4747 scope.go:117] "RemoveContainer" containerID="5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5" Nov 28 13:41:55 crc kubenswrapper[4747]: E1128 13:41:55.043298 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5\": container with ID starting with 5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5 not found: ID does not exist" containerID="5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5" Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:55.043333 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5"} err="failed to get container status \"5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5\": rpc error: code = NotFound desc = could not find container \"5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5\": container with ID starting with 5f5a3980275b17acc731ad71c32ad663665e8a453540a3fea0ce36f0811309f5 not found: ID does not exist" Nov 28 13:41:55 crc kubenswrapper[4747]: I1128 13:41:55.652845 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35df634a-7694-4627-b8b7-ad80526137cb" path="/var/lib/kubelet/pods/35df634a-7694-4627-b8b7-ad80526137cb/volumes" Nov 28 13:42:13 crc kubenswrapper[4747]: I1128 13:42:13.226042 4747 scope.go:117] "RemoveContainer" containerID="3bcba4fbd89bfb37e905f045754db6be1edfaf916f9595498a0fa14508e9702d" Nov 28 13:42:13 crc kubenswrapper[4747]: I1128 13:42:13.284383 4747 scope.go:117] "RemoveContainer" containerID="cdb37435c168413a7f19caad7a26b380ce64cddffee70ca39260303d74d95124" Nov 28 13:42:13 crc kubenswrapper[4747]: I1128 13:42:13.316911 4747 scope.go:117] "RemoveContainer" containerID="8cb92922f206b39e1e7636a950f851fbe747745a892cffe8e947a72194a2549b" Nov 28 13:42:13 crc kubenswrapper[4747]: I1128 13:42:13.350956 4747 scope.go:117] "RemoveContainer" containerID="fbd9a16b5ed2077c637d764ab51a37cb259d5ef712746a7062f927806e0630da" Nov 28 13:42:47 crc kubenswrapper[4747]: I1128 13:42:47.633166 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:42:47 crc kubenswrapper[4747]: I1128 13:42:47.633887 4747 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:43:13 crc kubenswrapper[4747]: I1128 13:43:13.454173 4747 scope.go:117] "RemoveContainer" containerID="4f6b4da1b30dc65e0c06a68e6fd79c7e645e2ea40b24fdcad0562b7aa5c88554" Nov 28 13:43:13 crc kubenswrapper[4747]: I1128 13:43:13.503683 4747 scope.go:117] "RemoveContainer" containerID="97625c7c601ccb0cf10013e0335070d5fcaeffa5eb21404929c581232c42eb22" Nov 28 13:43:13 crc kubenswrapper[4747]: I1128 13:43:13.545611 4747 scope.go:117] "RemoveContainer" containerID="143ec3f906d36e2da5a9e2f3ff369af37bc1c2858f0fca4ff478c41dfe1e6555" Nov 28 13:43:13 crc kubenswrapper[4747]: I1128 13:43:13.571063 4747 scope.go:117] "RemoveContainer" containerID="bb73d7f701f97da0d1fbc5805f5de66d8e72d185572d04c974d46a495918d76b" Nov 28 13:43:13 crc kubenswrapper[4747]: I1128 13:43:13.604880 4747 scope.go:117] "RemoveContainer" containerID="4c182fc717424510b860acc9ed3e1ca53439e5dec7fc1bd0c32102c4743a7335" Nov 28 13:43:17 crc kubenswrapper[4747]: I1128 13:43:17.633107 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:43:17 crc kubenswrapper[4747]: I1128 13:43:17.633632 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.562394 4747 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-85ckn"] Nov 28 13:43:45 crc kubenswrapper[4747]: E1128 13:43:45.563308 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="registry-server" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563329 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="registry-server" Nov 28 13:43:45 crc kubenswrapper[4747]: E1128 13:43:45.563351 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="extract-utilities" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563363 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="extract-utilities" Nov 28 13:43:45 crc kubenswrapper[4747]: E1128 13:43:45.563384 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="extract-content" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563397 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="extract-content" Nov 28 13:43:45 crc kubenswrapper[4747]: E1128 13:43:45.563422 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="extract-utilities" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563434 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="extract-utilities" Nov 28 13:43:45 crc kubenswrapper[4747]: E1128 13:43:45.563449 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="registry-server" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563461 4747 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="registry-server" Nov 28 13:43:45 crc kubenswrapper[4747]: E1128 13:43:45.563492 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="extract-content" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563503 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="extract-content" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563714 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4827d521-de6e-414a-8725-a667a7d5f1b9" containerName="registry-server" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.563731 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="35df634a-7694-4627-b8b7-ad80526137cb" containerName="registry-server" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.568533 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.593682 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85ckn"] Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.635980 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-utilities\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.636086 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c824\" (UniqueName: \"kubernetes.io/projected/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-kube-api-access-6c824\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.636159 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-catalog-content\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.737565 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c824\" (UniqueName: \"kubernetes.io/projected/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-kube-api-access-6c824\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.737667 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-catalog-content\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.737739 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-utilities\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.738156 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-catalog-content\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.738247 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-utilities\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.761304 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c824\" (UniqueName: \"kubernetes.io/projected/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-kube-api-access-6c824\") pod \"redhat-marketplace-85ckn\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:45 crc kubenswrapper[4747]: I1128 13:43:45.891816 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:46 crc kubenswrapper[4747]: I1128 13:43:46.125271 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85ckn"] Nov 28 13:43:46 crc kubenswrapper[4747]: I1128 13:43:46.949262 4747 generic.go:334] "Generic (PLEG): container finished" podID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerID="e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b" exitCode=0 Nov 28 13:43:46 crc kubenswrapper[4747]: I1128 13:43:46.949316 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85ckn" event={"ID":"9f9749a4-eb77-4a42-87c2-68f2fe4ff520","Type":"ContainerDied","Data":"e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b"} Nov 28 13:43:46 crc kubenswrapper[4747]: I1128 13:43:46.949351 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85ckn" event={"ID":"9f9749a4-eb77-4a42-87c2-68f2fe4ff520","Type":"ContainerStarted","Data":"5154c91bac5b11f57bd21973b2aba38d3afd390f558997a3e7d40f6f1a458896"} Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.633203 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.633322 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.633429 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.634360 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.634486 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" gracePeriod=600 Nov 28 13:43:47 crc kubenswrapper[4747]: E1128 13:43:47.784586 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.963107 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" exitCode=0 Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.963242 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" 
event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161"} Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.963327 4747 scope.go:117] "RemoveContainer" containerID="b2771dccb85c0ecd4859ca56d594c33cf7a03691a61ea3867cc5df5fbf1dd95c" Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.963930 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:43:47 crc kubenswrapper[4747]: E1128 13:43:47.964494 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:43:47 crc kubenswrapper[4747]: I1128 13:43:47.967369 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85ckn" event={"ID":"9f9749a4-eb77-4a42-87c2-68f2fe4ff520","Type":"ContainerStarted","Data":"d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743"} Nov 28 13:43:48 crc kubenswrapper[4747]: I1128 13:43:48.980007 4747 generic.go:334] "Generic (PLEG): container finished" podID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerID="d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743" exitCode=0 Nov 28 13:43:48 crc kubenswrapper[4747]: I1128 13:43:48.980104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85ckn" event={"ID":"9f9749a4-eb77-4a42-87c2-68f2fe4ff520","Type":"ContainerDied","Data":"d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743"} Nov 28 13:43:49 crc kubenswrapper[4747]: I1128 13:43:49.992591 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85ckn" event={"ID":"9f9749a4-eb77-4a42-87c2-68f2fe4ff520","Type":"ContainerStarted","Data":"89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a"} Nov 28 13:43:50 crc kubenswrapper[4747]: I1128 13:43:50.019110 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-85ckn" podStartSLOduration=2.573523593 podStartE2EDuration="5.019088918s" podCreationTimestamp="2025-11-28 13:43:45 +0000 UTC" firstStartedPulling="2025-11-28 13:43:46.955707151 +0000 UTC m=+1479.618188891" lastFinishedPulling="2025-11-28 13:43:49.401272476 +0000 UTC m=+1482.063754216" observedRunningTime="2025-11-28 13:43:50.009478231 +0000 UTC m=+1482.671959961" watchObservedRunningTime="2025-11-28 13:43:50.019088918 +0000 UTC m=+1482.681570658" Nov 28 13:43:55 crc kubenswrapper[4747]: I1128 13:43:55.892546 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:55 crc kubenswrapper[4747]: I1128 13:43:55.893566 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:55 crc kubenswrapper[4747]: I1128 13:43:55.938291 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:56 crc kubenswrapper[4747]: I1128 13:43:56.104901 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:56 crc kubenswrapper[4747]: I1128 13:43:56.171835 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-85ckn"] Nov 28 13:43:58 crc kubenswrapper[4747]: I1128 13:43:58.065504 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-85ckn" 
podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="registry-server" containerID="cri-o://89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a" gracePeriod=2 Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.007349 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.080834 4747 generic.go:334] "Generic (PLEG): container finished" podID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerID="89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a" exitCode=0 Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.080915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85ckn" event={"ID":"9f9749a4-eb77-4a42-87c2-68f2fe4ff520","Type":"ContainerDied","Data":"89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a"} Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.080943 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85ckn" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.080974 4747 scope.go:117] "RemoveContainer" containerID="89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.080956 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85ckn" event={"ID":"9f9749a4-eb77-4a42-87c2-68f2fe4ff520","Type":"ContainerDied","Data":"5154c91bac5b11f57bd21973b2aba38d3afd390f558997a3e7d40f6f1a458896"} Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.101167 4747 scope.go:117] "RemoveContainer" containerID="d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.118288 4747 scope.go:117] "RemoveContainer" containerID="e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.153718 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-utilities\") pod \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.154504 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c824\" (UniqueName: \"kubernetes.io/projected/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-kube-api-access-6c824\") pod \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.154584 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-utilities" (OuterVolumeSpecName: "utilities") pod "9f9749a4-eb77-4a42-87c2-68f2fe4ff520" (UID: "9f9749a4-eb77-4a42-87c2-68f2fe4ff520"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.154814 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-catalog-content\") pod \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\" (UID: \"9f9749a4-eb77-4a42-87c2-68f2fe4ff520\") " Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.155359 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.162489 4747 scope.go:117] "RemoveContainer" containerID="89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.162675 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-kube-api-access-6c824" (OuterVolumeSpecName: "kube-api-access-6c824") pod "9f9749a4-eb77-4a42-87c2-68f2fe4ff520" (UID: "9f9749a4-eb77-4a42-87c2-68f2fe4ff520"). InnerVolumeSpecName "kube-api-access-6c824". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:43:59 crc kubenswrapper[4747]: E1128 13:43:59.163070 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a\": container with ID starting with 89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a not found: ID does not exist" containerID="89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.163125 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a"} err="failed to get container status \"89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a\": rpc error: code = NotFound desc = could not find container \"89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a\": container with ID starting with 89524dbb2c7a7bf42e703c3533ecdf4b0a6b40e93ea8c097fce8d9d71ae2d12a not found: ID does not exist" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.163166 4747 scope.go:117] "RemoveContainer" containerID="d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743" Nov 28 13:43:59 crc kubenswrapper[4747]: E1128 13:43:59.163656 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743\": container with ID starting with d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743 not found: ID does not exist" containerID="d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.163713 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743"} 
err="failed to get container status \"d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743\": rpc error: code = NotFound desc = could not find container \"d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743\": container with ID starting with d6abd7394cc16da24f18a3035dd61372e108a41fd93e67b38b6a1db4d3f25743 not found: ID does not exist" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.163757 4747 scope.go:117] "RemoveContainer" containerID="e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b" Nov 28 13:43:59 crc kubenswrapper[4747]: E1128 13:43:59.165199 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b\": container with ID starting with e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b not found: ID does not exist" containerID="e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.165288 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b"} err="failed to get container status \"e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b\": rpc error: code = NotFound desc = could not find container \"e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b\": container with ID starting with e26091c92a461c965e99c1d8b25770faf0eedb94ed791cbe2ac483e7a90b370b not found: ID does not exist" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.171936 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f9749a4-eb77-4a42-87c2-68f2fe4ff520" (UID: "9f9749a4-eb77-4a42-87c2-68f2fe4ff520"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.256986 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.257030 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c824\" (UniqueName: \"kubernetes.io/projected/9f9749a4-eb77-4a42-87c2-68f2fe4ff520-kube-api-access-6c824\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.424154 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-85ckn"] Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.434129 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-85ckn"] Nov 28 13:43:59 crc kubenswrapper[4747]: I1128 13:43:59.652506 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" path="/var/lib/kubelet/pods/9f9749a4-eb77-4a42-87c2-68f2fe4ff520/volumes" Nov 28 13:44:02 crc kubenswrapper[4747]: I1128 13:44:02.642134 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:44:02 crc kubenswrapper[4747]: E1128 13:44:02.642974 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:44:13 crc kubenswrapper[4747]: I1128 13:44:13.641904 4747 scope.go:117] "RemoveContainer" 
containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:44:13 crc kubenswrapper[4747]: E1128 13:44:13.642722 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:44:13 crc kubenswrapper[4747]: I1128 13:44:13.703904 4747 scope.go:117] "RemoveContainer" containerID="9fb0be8d8b4b630123ebf77bca36eaac92269b4d4dac5521ffbbbfb890abf4c4" Nov 28 13:44:13 crc kubenswrapper[4747]: I1128 13:44:13.737629 4747 scope.go:117] "RemoveContainer" containerID="61d53147d8b414e017202486f565ba2add243fe3d10553a1a6f1ded91ec5dd3d" Nov 28 13:44:13 crc kubenswrapper[4747]: I1128 13:44:13.783458 4747 scope.go:117] "RemoveContainer" containerID="034f212cdf3ff034e2095e8361f3d47f3fab0be3d8a17ebf7ce37868d8a18e8f" Nov 28 13:44:24 crc kubenswrapper[4747]: I1128 13:44:24.641778 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:44:24 crc kubenswrapper[4747]: E1128 13:44:24.643533 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:44:36 crc kubenswrapper[4747]: I1128 13:44:36.643577 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:44:36 crc 
kubenswrapper[4747]: E1128 13:44:36.645558 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:44:49 crc kubenswrapper[4747]: I1128 13:44:49.641801 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:44:49 crc kubenswrapper[4747]: E1128 13:44:49.642866 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.157193 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt"] Nov 28 13:45:00 crc kubenswrapper[4747]: E1128 13:45:00.157981 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="registry-server" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.158006 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="registry-server" Nov 28 13:45:00 crc kubenswrapper[4747]: E1128 13:45:00.158038 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="extract-utilities" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 
13:45:00.158049 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="extract-utilities" Nov 28 13:45:00 crc kubenswrapper[4747]: E1128 13:45:00.158069 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="extract-content" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.158080 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="extract-content" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.158308 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9749a4-eb77-4a42-87c2-68f2fe4ff520" containerName="registry-server" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.159011 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.163040 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.163341 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.165352 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt"] Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.248413 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s27h\" (UniqueName: \"kubernetes.io/projected/7a884e45-e580-452e-8898-9838e7b2d309-kube-api-access-5s27h\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.248543 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a884e45-e580-452e-8898-9838e7b2d309-secret-volume\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.248630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a884e45-e580-452e-8898-9838e7b2d309-config-volume\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.349135 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s27h\" (UniqueName: \"kubernetes.io/projected/7a884e45-e580-452e-8898-9838e7b2d309-kube-api-access-5s27h\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.349200 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a884e45-e580-452e-8898-9838e7b2d309-secret-volume\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.349248 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7a884e45-e580-452e-8898-9838e7b2d309-config-volume\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.350015 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a884e45-e580-452e-8898-9838e7b2d309-config-volume\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.358883 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a884e45-e580-452e-8898-9838e7b2d309-secret-volume\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.372594 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s27h\" (UniqueName: \"kubernetes.io/projected/7a884e45-e580-452e-8898-9838e7b2d309-kube-api-access-5s27h\") pod \"collect-profiles-29405625-h4lvt\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.479024 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.642090 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:45:00 crc kubenswrapper[4747]: E1128 13:45:00.642609 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.707023 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt"] Nov 28 13:45:00 crc kubenswrapper[4747]: I1128 13:45:00.854748 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" event={"ID":"7a884e45-e580-452e-8898-9838e7b2d309","Type":"ContainerStarted","Data":"b876ce6bc5535a0e375024c086711807d52c70bfb763ecca866ada48c19a392e"} Nov 28 13:45:01 crc kubenswrapper[4747]: I1128 13:45:01.864667 4747 generic.go:334] "Generic (PLEG): container finished" podID="7a884e45-e580-452e-8898-9838e7b2d309" containerID="2dfe590f0feae7f6dfafbcd2c0eae30712f0d6e0e1f4cf0f78cac3fcfb8ae532" exitCode=0 Nov 28 13:45:01 crc kubenswrapper[4747]: I1128 13:45:01.864731 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" event={"ID":"7a884e45-e580-452e-8898-9838e7b2d309","Type":"ContainerDied","Data":"2dfe590f0feae7f6dfafbcd2c0eae30712f0d6e0e1f4cf0f78cac3fcfb8ae532"} Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.181435 4747 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.292101 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s27h\" (UniqueName: \"kubernetes.io/projected/7a884e45-e580-452e-8898-9838e7b2d309-kube-api-access-5s27h\") pod \"7a884e45-e580-452e-8898-9838e7b2d309\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.292186 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a884e45-e580-452e-8898-9838e7b2d309-config-volume\") pod \"7a884e45-e580-452e-8898-9838e7b2d309\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.292343 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a884e45-e580-452e-8898-9838e7b2d309-secret-volume\") pod \"7a884e45-e580-452e-8898-9838e7b2d309\" (UID: \"7a884e45-e580-452e-8898-9838e7b2d309\") " Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.293668 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a884e45-e580-452e-8898-9838e7b2d309-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a884e45-e580-452e-8898-9838e7b2d309" (UID: "7a884e45-e580-452e-8898-9838e7b2d309"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.303027 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a884e45-e580-452e-8898-9838e7b2d309-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a884e45-e580-452e-8898-9838e7b2d309" (UID: "7a884e45-e580-452e-8898-9838e7b2d309"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.303335 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a884e45-e580-452e-8898-9838e7b2d309-kube-api-access-5s27h" (OuterVolumeSpecName: "kube-api-access-5s27h") pod "7a884e45-e580-452e-8898-9838e7b2d309" (UID: "7a884e45-e580-452e-8898-9838e7b2d309"). InnerVolumeSpecName "kube-api-access-5s27h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.394119 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s27h\" (UniqueName: \"kubernetes.io/projected/7a884e45-e580-452e-8898-9838e7b2d309-kube-api-access-5s27h\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.394178 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a884e45-e580-452e-8898-9838e7b2d309-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.394197 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a884e45-e580-452e-8898-9838e7b2d309-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.884883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" event={"ID":"7a884e45-e580-452e-8898-9838e7b2d309","Type":"ContainerDied","Data":"b876ce6bc5535a0e375024c086711807d52c70bfb763ecca866ada48c19a392e"} Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.885278 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b876ce6bc5535a0e375024c086711807d52c70bfb763ecca866ada48c19a392e" Nov 28 13:45:03 crc kubenswrapper[4747]: I1128 13:45:03.884970 4747 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-h4lvt" Nov 28 13:45:12 crc kubenswrapper[4747]: I1128 13:45:12.641181 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:45:12 crc kubenswrapper[4747]: E1128 13:45:12.642121 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:45:13 crc kubenswrapper[4747]: I1128 13:45:13.874740 4747 scope.go:117] "RemoveContainer" containerID="2300a80e951736961c63a743af1abc6816ae8363a35c2b03e48bdf9bfb8b178a" Nov 28 13:45:13 crc kubenswrapper[4747]: I1128 13:45:13.913774 4747 scope.go:117] "RemoveContainer" containerID="226ce138e3ee2a1ab628d88af01d22feb47e24d10a59f54fed1c808b80dcedc5" Nov 28 13:45:13 crc kubenswrapper[4747]: I1128 13:45:13.941347 4747 scope.go:117] "RemoveContainer" containerID="268af55e61f04f76dfda085da104d3a3c5df8430103097b31d4c3d813a1d768a" Nov 28 13:45:13 crc kubenswrapper[4747]: I1128 13:45:13.969278 4747 scope.go:117] "RemoveContainer" containerID="5a17dfb9a1b0d7553ed30ffdf577d35d1d80dc591537255c25c3f7eb4a82e5cb" Nov 28 13:45:13 crc kubenswrapper[4747]: I1128 13:45:13.991102 4747 scope.go:117] "RemoveContainer" containerID="8b0f29fb0424b9090d05cc0fa3a54b5b8919472ba6d9b9801365fc937641f5e9" Nov 28 13:45:14 crc kubenswrapper[4747]: I1128 13:45:14.009770 4747 scope.go:117] "RemoveContainer" containerID="1a673ccc29f883f6368ca314fb0b5cca2020e9179b2ee355fbf084b46f139d49" Nov 28 13:45:27 crc kubenswrapper[4747]: I1128 13:45:27.646192 4747 scope.go:117] "RemoveContainer" 
containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:45:27 crc kubenswrapper[4747]: E1128 13:45:27.646744 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:45:38 crc kubenswrapper[4747]: I1128 13:45:38.641855 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:45:38 crc kubenswrapper[4747]: E1128 13:45:38.642746 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:45:50 crc kubenswrapper[4747]: I1128 13:45:50.642238 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:45:50 crc kubenswrapper[4747]: E1128 13:45:50.643260 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:46:02 crc kubenswrapper[4747]: I1128 13:46:02.641859 4747 scope.go:117] 
"RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:46:02 crc kubenswrapper[4747]: E1128 13:46:02.642830 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:46:13 crc kubenswrapper[4747]: I1128 13:46:13.642082 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:46:13 crc kubenswrapper[4747]: E1128 13:46:13.643295 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:46:14 crc kubenswrapper[4747]: I1128 13:46:14.101245 4747 scope.go:117] "RemoveContainer" containerID="605fe20bbef91d588159bc0336459229a048eadec4f34f2b9c41acf5c62067d2" Nov 28 13:46:14 crc kubenswrapper[4747]: I1128 13:46:14.139284 4747 scope.go:117] "RemoveContainer" containerID="1fea151d82f9613ef13905a9946d380a841e9c30cd420a4a5a116490c2719613" Nov 28 13:46:27 crc kubenswrapper[4747]: I1128 13:46:27.648198 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:46:27 crc kubenswrapper[4747]: E1128 13:46:27.649420 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:46:41 crc kubenswrapper[4747]: I1128 13:46:41.641692 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:46:41 crc kubenswrapper[4747]: E1128 13:46:41.642496 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:46:55 crc kubenswrapper[4747]: I1128 13:46:55.641763 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:46:55 crc kubenswrapper[4747]: E1128 13:46:55.642696 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:47:10 crc kubenswrapper[4747]: I1128 13:47:10.641072 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:47:10 crc kubenswrapper[4747]: E1128 13:47:10.641818 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:47:23 crc kubenswrapper[4747]: I1128 13:47:23.641095 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:47:23 crc kubenswrapper[4747]: E1128 13:47:23.641998 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:47:38 crc kubenswrapper[4747]: I1128 13:47:38.642183 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:47:38 crc kubenswrapper[4747]: E1128 13:47:38.642837 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:47:51 crc kubenswrapper[4747]: I1128 13:47:51.642286 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:47:51 crc kubenswrapper[4747]: E1128 13:47:51.643607 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:48:05 crc kubenswrapper[4747]: I1128 13:48:05.641725 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:48:05 crc kubenswrapper[4747]: E1128 13:48:05.642540 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:48:18 crc kubenswrapper[4747]: I1128 13:48:18.641712 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:48:18 crc kubenswrapper[4747]: E1128 13:48:18.642948 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:48:29 crc kubenswrapper[4747]: I1128 13:48:29.641851 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:48:29 crc kubenswrapper[4747]: E1128 13:48:29.643132 4747 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:48:44 crc kubenswrapper[4747]: I1128 13:48:44.642755 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:48:44 crc kubenswrapper[4747]: E1128 13:48:44.643952 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:48:56 crc kubenswrapper[4747]: I1128 13:48:56.641248 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:48:57 crc kubenswrapper[4747]: I1128 13:48:57.125078 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"067619c9f8cf3335b45d110ae252404003be27cf7880dcac06aace80a9501192"} Nov 28 13:49:35 crc kubenswrapper[4747]: I1128 13:49:35.969688 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-975f7"] Nov 28 13:49:35 crc kubenswrapper[4747]: E1128 13:49:35.970576 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a884e45-e580-452e-8898-9838e7b2d309" containerName="collect-profiles" Nov 28 13:49:35 crc kubenswrapper[4747]: I1128 
13:49:35.970593 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a884e45-e580-452e-8898-9838e7b2d309" containerName="collect-profiles" Nov 28 13:49:35 crc kubenswrapper[4747]: I1128 13:49:35.970738 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a884e45-e580-452e-8898-9838e7b2d309" containerName="collect-profiles" Nov 28 13:49:35 crc kubenswrapper[4747]: I1128 13:49:35.971873 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:35 crc kubenswrapper[4747]: I1128 13:49:35.979732 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-975f7"] Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.070347 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-utilities\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.070439 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jjk\" (UniqueName: \"kubernetes.io/projected/43361946-9a0e-48a1-81ef-b18e024d21b8-kube-api-access-m7jjk\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.070537 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-catalog-content\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc 
kubenswrapper[4747]: I1128 13:49:36.171735 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-catalog-content\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.172058 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-utilities\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.172099 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jjk\" (UniqueName: \"kubernetes.io/projected/43361946-9a0e-48a1-81ef-b18e024d21b8-kube-api-access-m7jjk\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.172398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-catalog-content\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.172627 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-utilities\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.189638 
4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jjk\" (UniqueName: \"kubernetes.io/projected/43361946-9a0e-48a1-81ef-b18e024d21b8-kube-api-access-m7jjk\") pod \"community-operators-975f7\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.292165 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:36 crc kubenswrapper[4747]: I1128 13:49:36.527164 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-975f7"] Nov 28 13:49:37 crc kubenswrapper[4747]: I1128 13:49:37.442951 4747 generic.go:334] "Generic (PLEG): container finished" podID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerID="25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f" exitCode=0 Nov 28 13:49:37 crc kubenswrapper[4747]: I1128 13:49:37.442990 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-975f7" event={"ID":"43361946-9a0e-48a1-81ef-b18e024d21b8","Type":"ContainerDied","Data":"25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f"} Nov 28 13:49:37 crc kubenswrapper[4747]: I1128 13:49:37.443016 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-975f7" event={"ID":"43361946-9a0e-48a1-81ef-b18e024d21b8","Type":"ContainerStarted","Data":"3a3d6fdc0cb37f078e0e46f05ec75bdb43124e9d43b79b73130abb147aa48e5a"} Nov 28 13:49:37 crc kubenswrapper[4747]: I1128 13:49:37.445859 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:49:39 crc kubenswrapper[4747]: I1128 13:49:39.462567 4747 generic.go:334] "Generic (PLEG): container finished" podID="43361946-9a0e-48a1-81ef-b18e024d21b8" 
containerID="86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c" exitCode=0 Nov 28 13:49:39 crc kubenswrapper[4747]: I1128 13:49:39.462727 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-975f7" event={"ID":"43361946-9a0e-48a1-81ef-b18e024d21b8","Type":"ContainerDied","Data":"86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c"} Nov 28 13:49:40 crc kubenswrapper[4747]: I1128 13:49:40.472687 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-975f7" event={"ID":"43361946-9a0e-48a1-81ef-b18e024d21b8","Type":"ContainerStarted","Data":"15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3"} Nov 28 13:49:40 crc kubenswrapper[4747]: I1128 13:49:40.493993 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-975f7" podStartSLOduration=2.923715987 podStartE2EDuration="5.49397453s" podCreationTimestamp="2025-11-28 13:49:35 +0000 UTC" firstStartedPulling="2025-11-28 13:49:37.44566325 +0000 UTC m=+1830.108144980" lastFinishedPulling="2025-11-28 13:49:40.015921793 +0000 UTC m=+1832.678403523" observedRunningTime="2025-11-28 13:49:40.492064982 +0000 UTC m=+1833.154546732" watchObservedRunningTime="2025-11-28 13:49:40.49397453 +0000 UTC m=+1833.156456260" Nov 28 13:49:46 crc kubenswrapper[4747]: I1128 13:49:46.292655 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:46 crc kubenswrapper[4747]: I1128 13:49:46.294594 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:46 crc kubenswrapper[4747]: I1128 13:49:46.368015 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:46 crc kubenswrapper[4747]: I1128 13:49:46.582437 
4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:49 crc kubenswrapper[4747]: I1128 13:49:49.941696 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-975f7"] Nov 28 13:49:49 crc kubenswrapper[4747]: I1128 13:49:49.942721 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-975f7" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="registry-server" containerID="cri-o://15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3" gracePeriod=2 Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.416339 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.518043 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-utilities\") pod \"43361946-9a0e-48a1-81ef-b18e024d21b8\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.518153 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-catalog-content\") pod \"43361946-9a0e-48a1-81ef-b18e024d21b8\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.518252 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jjk\" (UniqueName: \"kubernetes.io/projected/43361946-9a0e-48a1-81ef-b18e024d21b8-kube-api-access-m7jjk\") pod \"43361946-9a0e-48a1-81ef-b18e024d21b8\" (UID: \"43361946-9a0e-48a1-81ef-b18e024d21b8\") " Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 
13:49:51.519666 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-utilities" (OuterVolumeSpecName: "utilities") pod "43361946-9a0e-48a1-81ef-b18e024d21b8" (UID: "43361946-9a0e-48a1-81ef-b18e024d21b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.529486 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43361946-9a0e-48a1-81ef-b18e024d21b8-kube-api-access-m7jjk" (OuterVolumeSpecName: "kube-api-access-m7jjk") pod "43361946-9a0e-48a1-81ef-b18e024d21b8" (UID: "43361946-9a0e-48a1-81ef-b18e024d21b8"). InnerVolumeSpecName "kube-api-access-m7jjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.560128 4747 generic.go:334] "Generic (PLEG): container finished" podID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerID="15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3" exitCode=0 Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.560182 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-975f7" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.560190 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-975f7" event={"ID":"43361946-9a0e-48a1-81ef-b18e024d21b8","Type":"ContainerDied","Data":"15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3"} Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.560255 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-975f7" event={"ID":"43361946-9a0e-48a1-81ef-b18e024d21b8","Type":"ContainerDied","Data":"3a3d6fdc0cb37f078e0e46f05ec75bdb43124e9d43b79b73130abb147aa48e5a"} Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.560293 4747 scope.go:117] "RemoveContainer" containerID="15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.580678 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43361946-9a0e-48a1-81ef-b18e024d21b8" (UID: "43361946-9a0e-48a1-81ef-b18e024d21b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.582917 4747 scope.go:117] "RemoveContainer" containerID="86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.609955 4747 scope.go:117] "RemoveContainer" containerID="25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.619564 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jjk\" (UniqueName: \"kubernetes.io/projected/43361946-9a0e-48a1-81ef-b18e024d21b8-kube-api-access-m7jjk\") on node \"crc\" DevicePath \"\"" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.619595 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.619604 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43361946-9a0e-48a1-81ef-b18e024d21b8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.633090 4747 scope.go:117] "RemoveContainer" containerID="15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3" Nov 28 13:49:51 crc kubenswrapper[4747]: E1128 13:49:51.636673 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3\": container with ID starting with 15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3 not found: ID does not exist" containerID="15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.636728 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3"} err="failed to get container status \"15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3\": rpc error: code = NotFound desc = could not find container \"15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3\": container with ID starting with 15465c67809915fa39e8161527b79ba53b9294c8d7456584ab262586ca6353b3 not found: ID does not exist" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.636762 4747 scope.go:117] "RemoveContainer" containerID="86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c" Nov 28 13:49:51 crc kubenswrapper[4747]: E1128 13:49:51.637051 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c\": container with ID starting with 86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c not found: ID does not exist" containerID="86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.637101 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c"} err="failed to get container status \"86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c\": rpc error: code = NotFound desc = could not find container \"86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c\": container with ID starting with 86f42c047cba7c4ff6b4645e01f6d7d439bba69757e5ed0e046853bbdb2db96c not found: ID does not exist" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.637119 4747 scope.go:117] "RemoveContainer" containerID="25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f" Nov 28 13:49:51 crc kubenswrapper[4747]: E1128 13:49:51.637613 4747 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f\": container with ID starting with 25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f not found: ID does not exist" containerID="25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.637647 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f"} err="failed to get container status \"25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f\": rpc error: code = NotFound desc = could not find container \"25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f\": container with ID starting with 25889776ddf3db0335232227e2d6c3b64e5e3e1409320509e8fd3f9d9ff9da5f not found: ID does not exist" Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.884012 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-975f7"] Nov 28 13:49:51 crc kubenswrapper[4747]: I1128 13:49:51.888479 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-975f7"] Nov 28 13:49:53 crc kubenswrapper[4747]: I1128 13:49:53.655108 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" path="/var/lib/kubelet/pods/43361946-9a0e-48a1-81ef-b18e024d21b8/volumes" Nov 28 13:50:19 crc kubenswrapper[4747]: I1128 13:50:19.320550 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr"] Nov 28 13:50:19 crc kubenswrapper[4747]: I1128 13:50:19.324329 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6vhkw"] Nov 28 13:50:19 crc kubenswrapper[4747]: I1128 13:50:19.330611 4747 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["keystone-kuttl-tests/keystone-5932-account-create-update-dnrrr"] Nov 28 13:50:19 crc kubenswrapper[4747]: I1128 13:50:19.337517 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6vhkw"] Nov 28 13:50:19 crc kubenswrapper[4747]: I1128 13:50:19.649847 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f78845-e46a-450c-a48c-d615aaf3f006" path="/var/lib/kubelet/pods/89f78845-e46a-450c-a48c-d615aaf3f006/volumes" Nov 28 13:50:19 crc kubenswrapper[4747]: I1128 13:50:19.650391 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bf73483-8927-4503-a3fd-4ae240b239a2" path="/var/lib/kubelet/pods/9bf73483-8927-4503-a3fd-4ae240b239a2/volumes" Nov 28 13:50:25 crc kubenswrapper[4747]: I1128 13:50:25.034360 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gddwb"] Nov 28 13:50:25 crc kubenswrapper[4747]: I1128 13:50:25.040555 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gddwb"] Nov 28 13:50:25 crc kubenswrapper[4747]: I1128 13:50:25.649307 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a854eacc-7a37-48a7-97c7-4cd27fdfc915" path="/var/lib/kubelet/pods/a854eacc-7a37-48a7-97c7-4cd27fdfc915/volumes" Nov 28 13:50:31 crc kubenswrapper[4747]: I1128 13:50:31.023615 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cnhv9"] Nov 28 13:50:31 crc kubenswrapper[4747]: I1128 13:50:31.028417 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cnhv9"] Nov 28 13:50:31 crc kubenswrapper[4747]: I1128 13:50:31.652879 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c11ddd-fa23-4030-ac09-1789e08a0d5f" path="/var/lib/kubelet/pods/a1c11ddd-fa23-4030-ac09-1789e08a0d5f/volumes" Nov 28 13:51:14 crc kubenswrapper[4747]: I1128 13:51:14.317969 
4747 scope.go:117] "RemoveContainer" containerID="465a919db7d94de464b25ea14d7d3ac670f55204803a0ba32e298f2296b6520e" Nov 28 13:51:14 crc kubenswrapper[4747]: I1128 13:51:14.353300 4747 scope.go:117] "RemoveContainer" containerID="57018e1f8ec8169e5cc61f7432cefd32875fb3b12efdc67831b75a2655f5b222" Nov 28 13:51:14 crc kubenswrapper[4747]: I1128 13:51:14.375110 4747 scope.go:117] "RemoveContainer" containerID="ae03412466d114dba4d0f3f8ca3cd005434c7e451a3fd6a0f7fc626b583b5792" Nov 28 13:51:14 crc kubenswrapper[4747]: I1128 13:51:14.405074 4747 scope.go:117] "RemoveContainer" containerID="9d4839005f40a70a67864c3ceb511b8bc0d5f41f64bae1bd751dcfebce144ec4" Nov 28 13:51:17 crc kubenswrapper[4747]: I1128 13:51:17.633257 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:51:17 crc kubenswrapper[4747]: I1128 13:51:17.633720 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:51:21 crc kubenswrapper[4747]: I1128 13:51:21.660142 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:51:21 crc kubenswrapper[4747]: I1128 13:51:21.660706 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstackclient" podUID="4379fc67-e837-4fc6-bafd-ccf286d37b67" containerName="openstackclient" containerID="cri-o://20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c" gracePeriod=30 Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.072538 4747 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.226122 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config-secret\") pod \"4379fc67-e837-4fc6-bafd-ccf286d37b67\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.226168 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config\") pod \"4379fc67-e837-4fc6-bafd-ccf286d37b67\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.226282 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8nj5\" (UniqueName: \"kubernetes.io/projected/4379fc67-e837-4fc6-bafd-ccf286d37b67-kube-api-access-g8nj5\") pod \"4379fc67-e837-4fc6-bafd-ccf286d37b67\" (UID: \"4379fc67-e837-4fc6-bafd-ccf286d37b67\") " Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.234635 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4379fc67-e837-4fc6-bafd-ccf286d37b67-kube-api-access-g8nj5" (OuterVolumeSpecName: "kube-api-access-g8nj5") pod "4379fc67-e837-4fc6-bafd-ccf286d37b67" (UID: "4379fc67-e837-4fc6-bafd-ccf286d37b67"). InnerVolumeSpecName "kube-api-access-g8nj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.250300 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4379fc67-e837-4fc6-bafd-ccf286d37b67" (UID: "4379fc67-e837-4fc6-bafd-ccf286d37b67"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.250465 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4379fc67-e837-4fc6-bafd-ccf286d37b67" (UID: "4379fc67-e837-4fc6-bafd-ccf286d37b67"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.310364 4747 generic.go:334] "Generic (PLEG): container finished" podID="4379fc67-e837-4fc6-bafd-ccf286d37b67" containerID="20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c" exitCode=143 Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.310428 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4379fc67-e837-4fc6-bafd-ccf286d37b67","Type":"ContainerDied","Data":"20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c"} Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.310485 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4379fc67-e837-4fc6-bafd-ccf286d37b67","Type":"ContainerDied","Data":"f20a72c688327e9eddde26f158adc59982c35292dbcc516d7d73beaa4e194edb"} Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.310506 4747 scope.go:117] "RemoveContainer" 
containerID="20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.310806 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.328063 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.328091 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4379fc67-e837-4fc6-bafd-ccf286d37b67-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.328101 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8nj5\" (UniqueName: \"kubernetes.io/projected/4379fc67-e837-4fc6-bafd-ccf286d37b67-kube-api-access-g8nj5\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.328773 4747 scope.go:117] "RemoveContainer" containerID="20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c" Nov 28 13:51:22 crc kubenswrapper[4747]: E1128 13:51:22.329298 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c\": container with ID starting with 20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c not found: ID does not exist" containerID="20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.329377 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c"} err="failed to get 
container status \"20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c\": rpc error: code = NotFound desc = could not find container \"20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c\": container with ID starting with 20ee4214ca006a1794784c36b2e1bc8a7b53b7945a3b3ef3052905be925da98c not found: ID does not exist" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.345728 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.350851 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.461844 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-66465fdcd5-dvr7p"] Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.462093 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" podUID="bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" containerName="keystone-api" containerID="cri-o://e0b19e0f614a81d9de7af066618587997d885bc312037114741abbc557d2a1c3" gracePeriod=30 Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.520744 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone5932-account-delete-9hbzb"] Nov 28 13:51:22 crc kubenswrapper[4747]: E1128 13:51:22.521063 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="extract-content" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.521087 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="extract-content" Nov 28 13:51:22 crc kubenswrapper[4747]: E1128 13:51:22.521107 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="registry-server" Nov 28 
13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.521115 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="registry-server" Nov 28 13:51:22 crc kubenswrapper[4747]: E1128 13:51:22.521137 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4379fc67-e837-4fc6-bafd-ccf286d37b67" containerName="openstackclient" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.521147 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4379fc67-e837-4fc6-bafd-ccf286d37b67" containerName="openstackclient" Nov 28 13:51:22 crc kubenswrapper[4747]: E1128 13:51:22.521160 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="extract-utilities" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.521168 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="extract-utilities" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.521312 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4379fc67-e837-4fc6-bafd-ccf286d37b67" containerName="openstackclient" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.521328 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="43361946-9a0e-48a1-81ef-b18e024d21b8" containerName="registry-server" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.521777 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.529277 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5932-account-delete-9hbzb"] Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.635151 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2d9d2f6-af70-446e-92b0-d41d8af9f656-operator-scripts\") pod \"keystone5932-account-delete-9hbzb\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.635202 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr7zm\" (UniqueName: \"kubernetes.io/projected/e2d9d2f6-af70-446e-92b0-d41d8af9f656-kube-api-access-dr7zm\") pod \"keystone5932-account-delete-9hbzb\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.736663 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2d9d2f6-af70-446e-92b0-d41d8af9f656-operator-scripts\") pod \"keystone5932-account-delete-9hbzb\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.736728 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr7zm\" (UniqueName: \"kubernetes.io/projected/e2d9d2f6-af70-446e-92b0-d41d8af9f656-kube-api-access-dr7zm\") pod \"keystone5932-account-delete-9hbzb\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:22 crc 
kubenswrapper[4747]: I1128 13:51:22.737480 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2d9d2f6-af70-446e-92b0-d41d8af9f656-operator-scripts\") pod \"keystone5932-account-delete-9hbzb\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.753536 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr7zm\" (UniqueName: \"kubernetes.io/projected/e2d9d2f6-af70-446e-92b0-d41d8af9f656-kube-api-access-dr7zm\") pod \"keystone5932-account-delete-9hbzb\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:22 crc kubenswrapper[4747]: I1128 13:51:22.835790 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:23 crc kubenswrapper[4747]: I1128 13:51:23.023306 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5932-account-delete-9hbzb"] Nov 28 13:51:23 crc kubenswrapper[4747]: I1128 13:51:23.318906 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" event={"ID":"e2d9d2f6-af70-446e-92b0-d41d8af9f656","Type":"ContainerStarted","Data":"07dc9cd2c22b1b6782dde34aa0b21c254e35cabadc778a9fd8e90792c4d68599"} Nov 28 13:51:23 crc kubenswrapper[4747]: I1128 13:51:23.318958 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" event={"ID":"e2d9d2f6-af70-446e-92b0-d41d8af9f656","Type":"ContainerStarted","Data":"804d6956ac41638d0e6837f530a8b952e2613a7cd59b407ead1dd9cfb1e02d4d"} Nov 28 13:51:23 crc kubenswrapper[4747]: I1128 13:51:23.650974 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4379fc67-e837-4fc6-bafd-ccf286d37b67" path="/var/lib/kubelet/pods/4379fc67-e837-4fc6-bafd-ccf286d37b67/volumes" Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.331772 4747 generic.go:334] "Generic (PLEG): container finished" podID="e2d9d2f6-af70-446e-92b0-d41d8af9f656" containerID="07dc9cd2c22b1b6782dde34aa0b21c254e35cabadc778a9fd8e90792c4d68599" exitCode=0 Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.331833 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" event={"ID":"e2d9d2f6-af70-446e-92b0-d41d8af9f656","Type":"ContainerDied","Data":"07dc9cd2c22b1b6782dde34aa0b21c254e35cabadc778a9fd8e90792c4d68599"} Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.578862 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.664502 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr7zm\" (UniqueName: \"kubernetes.io/projected/e2d9d2f6-af70-446e-92b0-d41d8af9f656-kube-api-access-dr7zm\") pod \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.664554 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2d9d2f6-af70-446e-92b0-d41d8af9f656-operator-scripts\") pod \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\" (UID: \"e2d9d2f6-af70-446e-92b0-d41d8af9f656\") " Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.664940 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2d9d2f6-af70-446e-92b0-d41d8af9f656-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2d9d2f6-af70-446e-92b0-d41d8af9f656" (UID: "e2d9d2f6-af70-446e-92b0-d41d8af9f656"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.670082 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d9d2f6-af70-446e-92b0-d41d8af9f656-kube-api-access-dr7zm" (OuterVolumeSpecName: "kube-api-access-dr7zm") pod "e2d9d2f6-af70-446e-92b0-d41d8af9f656" (UID: "e2d9d2f6-af70-446e-92b0-d41d8af9f656"). InnerVolumeSpecName "kube-api-access-dr7zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.767594 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr7zm\" (UniqueName: \"kubernetes.io/projected/e2d9d2f6-af70-446e-92b0-d41d8af9f656-kube-api-access-dr7zm\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:24 crc kubenswrapper[4747]: I1128 13:51:24.767635 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2d9d2f6-af70-446e-92b0-d41d8af9f656-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:25 crc kubenswrapper[4747]: I1128 13:51:25.343406 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" event={"ID":"e2d9d2f6-af70-446e-92b0-d41d8af9f656","Type":"ContainerDied","Data":"804d6956ac41638d0e6837f530a8b952e2613a7cd59b407ead1dd9cfb1e02d4d"} Nov 28 13:51:25 crc kubenswrapper[4747]: I1128 13:51:25.343980 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="804d6956ac41638d0e6837f530a8b952e2613a7cd59b407ead1dd9cfb1e02d4d" Nov 28 13:51:25 crc kubenswrapper[4747]: I1128 13:51:25.343482 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone5932-account-delete-9hbzb" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.350621 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" containerID="e0b19e0f614a81d9de7af066618587997d885bc312037114741abbc557d2a1c3" exitCode=0 Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.350658 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" event={"ID":"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119","Type":"ContainerDied","Data":"e0b19e0f614a81d9de7af066618587997d885bc312037114741abbc557d2a1c3"} Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.502349 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.594662 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-scripts\") pod \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.594724 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkdzk\" (UniqueName: \"kubernetes.io/projected/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-kube-api-access-rkdzk\") pod \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.594825 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-config-data\") pod \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.594893 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-credential-keys\") pod \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.594924 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-fernet-keys\") pod \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\" (UID: \"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119\") " Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.602392 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" (UID: "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.603547 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" (UID: "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.603589 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-kube-api-access-rkdzk" (OuterVolumeSpecName: "kube-api-access-rkdzk") pod "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" (UID: "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119"). InnerVolumeSpecName "kube-api-access-rkdzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.603700 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-scripts" (OuterVolumeSpecName: "scripts") pod "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" (UID: "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.612637 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-config-data" (OuterVolumeSpecName: "config-data") pod "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" (UID: "bc0d1e57-6488-4fd6-bbd8-16ae6be6a119"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.697183 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.697230 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.697243 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:26 crc kubenswrapper[4747]: I1128 13:51:26.697254 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkdzk\" (UniqueName: \"kubernetes.io/projected/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-kube-api-access-rkdzk\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:26 crc kubenswrapper[4747]: 
I1128 13:51:26.697268 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.359029 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" event={"ID":"bc0d1e57-6488-4fd6-bbd8-16ae6be6a119","Type":"ContainerDied","Data":"52389b9070e3decad73e78aebf1e9a31618fb8b127b68ff944e5ca9a664da3b9"} Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.359093 4747 scope.go:117] "RemoveContainer" containerID="e0b19e0f614a81d9de7af066618587997d885bc312037114741abbc557d2a1c3" Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.359287 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-66465fdcd5-dvr7p" Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.403345 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-66465fdcd5-dvr7p"] Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.409286 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-66465fdcd5-dvr7p"] Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.562437 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone5932-account-delete-9hbzb"] Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.570266 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone5932-account-delete-9hbzb"] Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.648849 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" path="/var/lib/kubelet/pods/bc0d1e57-6488-4fd6-bbd8-16ae6be6a119/volumes" Nov 28 13:51:27 crc kubenswrapper[4747]: I1128 13:51:27.649355 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e2d9d2f6-af70-446e-92b0-d41d8af9f656" path="/var/lib/kubelet/pods/e2d9d2f6-af70-446e-92b0-d41d8af9f656/volumes" Nov 28 13:51:38 crc kubenswrapper[4747]: I1128 13:51:38.709393 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:51:38 crc kubenswrapper[4747]: I1128 13:51:38.715534 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:51:38 crc kubenswrapper[4747]: I1128 13:51:38.720816 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:51:38 crc kubenswrapper[4747]: I1128 13:51:38.866058 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-2" podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerName="galera" containerID="cri-o://8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59" gracePeriod=30 Nov 28 13:51:39 crc kubenswrapper[4747]: I1128 13:51:39.519397 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:51:39 crc kubenswrapper[4747]: I1128 13:51:39.519597 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/memcached-0" podUID="b2e6a2c7-dee9-40e2-a7fa-78038c271647" containerName="memcached" containerID="cri-o://519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d" gracePeriod=30 Nov 28 13:51:39 crc kubenswrapper[4747]: I1128 13:51:39.995325 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.052344 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.096825 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-kolla-config\") pod \"625975dd-71a7-40d7-b99b-7204545ab2d5\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.096914 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-generated\") pod \"625975dd-71a7-40d7-b99b-7204545ab2d5\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.096953 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"625975dd-71a7-40d7-b99b-7204545ab2d5\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.096980 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-operator-scripts\") pod \"625975dd-71a7-40d7-b99b-7204545ab2d5\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.097015 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpmzz\" (UniqueName: \"kubernetes.io/projected/625975dd-71a7-40d7-b99b-7204545ab2d5-kube-api-access-rpmzz\") pod \"625975dd-71a7-40d7-b99b-7204545ab2d5\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.097035 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-default\") pod \"625975dd-71a7-40d7-b99b-7204545ab2d5\" (UID: \"625975dd-71a7-40d7-b99b-7204545ab2d5\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.097646 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "625975dd-71a7-40d7-b99b-7204545ab2d5" (UID: "625975dd-71a7-40d7-b99b-7204545ab2d5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.098296 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "625975dd-71a7-40d7-b99b-7204545ab2d5" (UID: "625975dd-71a7-40d7-b99b-7204545ab2d5"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.098289 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "625975dd-71a7-40d7-b99b-7204545ab2d5" (UID: "625975dd-71a7-40d7-b99b-7204545ab2d5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.100325 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "625975dd-71a7-40d7-b99b-7204545ab2d5" (UID: "625975dd-71a7-40d7-b99b-7204545ab2d5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.110129 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625975dd-71a7-40d7-b99b-7204545ab2d5-kube-api-access-rpmzz" (OuterVolumeSpecName: "kube-api-access-rpmzz") pod "625975dd-71a7-40d7-b99b-7204545ab2d5" (UID: "625975dd-71a7-40d7-b99b-7204545ab2d5"). InnerVolumeSpecName "kube-api-access-rpmzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.113101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "625975dd-71a7-40d7-b99b-7204545ab2d5" (UID: "625975dd-71a7-40d7-b99b-7204545ab2d5"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.198170 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.198213 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpmzz\" (UniqueName: \"kubernetes.io/projected/625975dd-71a7-40d7-b99b-7204545ab2d5-kube-api-access-rpmzz\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.198247 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.198260 4747 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/625975dd-71a7-40d7-b99b-7204545ab2d5-kolla-config\") on 
node \"crc\" DevicePath \"\"" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.198273 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/625975dd-71a7-40d7-b99b-7204545ab2d5-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.198307 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.211839 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.299441 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.409132 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.453421 4747 generic.go:334] "Generic (PLEG): container finished" podID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerID="8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59" exitCode=0 Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.453491 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.453522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"625975dd-71a7-40d7-b99b-7204545ab2d5","Type":"ContainerDied","Data":"8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59"} Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.453583 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"625975dd-71a7-40d7-b99b-7204545ab2d5","Type":"ContainerDied","Data":"1260e5c731f919f465450128a6b38ea5c3c7acdb87cb964711d72145558c854e"} Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.453618 4747 scope.go:117] "RemoveContainer" containerID="8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.490375 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.503476 4747 scope.go:117] "RemoveContainer" containerID="3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.518302 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.534409 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/rabbitmq-server-0" podUID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerName="rabbitmq" containerID="cri-o://4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9" gracePeriod=604800 Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.553163 4747 scope.go:117] "RemoveContainer" containerID="8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59" Nov 28 13:51:40 crc kubenswrapper[4747]: E1128 13:51:40.553595 4747 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59\": container with ID starting with 8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59 not found: ID does not exist" containerID="8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.553636 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59"} err="failed to get container status \"8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59\": rpc error: code = NotFound desc = could not find container \"8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59\": container with ID starting with 8a4e4bee6655349b81e347dcf7f551359993a64a79f88e5f50d8f7b2b909ce59 not found: ID does not exist" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.553664 4747 scope.go:117] "RemoveContainer" containerID="3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9" Nov 28 13:51:40 crc kubenswrapper[4747]: E1128 13:51:40.554617 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9\": container with ID starting with 3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9 not found: ID does not exist" containerID="3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.554643 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9"} err="failed to get container status \"3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9\": rpc error: code = NotFound 
desc = could not find container \"3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9\": container with ID starting with 3749b854e1e1cdaaf69a70f95667183dbe6d149990a9b503489367d4ec955de9 not found: ID does not exist" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.878735 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.898081 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-1" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerName="galera" containerID="cri-o://43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" gracePeriod=28 Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.909263 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-config-data\") pod \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.909310 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kolla-config\") pod \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.909376 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq5w4\" (UniqueName: \"kubernetes.io/projected/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kube-api-access-vq5w4\") pod \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\" (UID: \"b2e6a2c7-dee9-40e2-a7fa-78038c271647\") " Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.911084 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-config-data" (OuterVolumeSpecName: "config-data") pod "b2e6a2c7-dee9-40e2-a7fa-78038c271647" (UID: "b2e6a2c7-dee9-40e2-a7fa-78038c271647"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.911101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b2e6a2c7-dee9-40e2-a7fa-78038c271647" (UID: "b2e6a2c7-dee9-40e2-a7fa-78038c271647"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:40 crc kubenswrapper[4747]: I1128 13:51:40.914150 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kube-api-access-vq5w4" (OuterVolumeSpecName: "kube-api-access-vq5w4") pod "b2e6a2c7-dee9-40e2-a7fa-78038c271647" (UID: "b2e6a2c7-dee9-40e2-a7fa-78038c271647"). InnerVolumeSpecName "kube-api-access-vq5w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.010331 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.010373 4747 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.010390 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq5w4\" (UniqueName: \"kubernetes.io/projected/b2e6a2c7-dee9-40e2-a7fa-78038c271647-kube-api-access-vq5w4\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.283038 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd"] Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.283652 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" podUID="8e64428a-5763-44ae-87c3-e45ba2c3a039" containerName="manager" containerID="cri-o://7cb94193d218edb947e0d91e36be5b76272f7596495c4c5388a8a911bdd00bcb" gracePeriod=10 Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.462689 4747 generic.go:334] "Generic (PLEG): container finished" podID="b2e6a2c7-dee9-40e2-a7fa-78038c271647" containerID="519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d" exitCode=0 Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.462768 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" 
event={"ID":"b2e6a2c7-dee9-40e2-a7fa-78038c271647","Type":"ContainerDied","Data":"519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d"} Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.462801 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"b2e6a2c7-dee9-40e2-a7fa-78038c271647","Type":"ContainerDied","Data":"94299ed8f5cdecc1090f3a9d45eb976bd6228f5211203e6294ab4e5022add22f"} Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.462814 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.462822 4747 scope.go:117] "RemoveContainer" containerID="519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.464524 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e64428a-5763-44ae-87c3-e45ba2c3a039" containerID="7cb94193d218edb947e0d91e36be5b76272f7596495c4c5388a8a911bdd00bcb" exitCode=0 Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.464584 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" event={"ID":"8e64428a-5763-44ae-87c3-e45ba2c3a039","Type":"ContainerDied","Data":"7cb94193d218edb947e0d91e36be5b76272f7596495c4c5388a8a911bdd00bcb"} Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.499504 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-g2f9t"] Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.499789 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-g2f9t" podUID="c8f39d39-8a82-4e51-9a4c-81d2476e5d42" containerName="registry-server" containerID="cri-o://9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b" gracePeriod=30 Nov 28 13:51:41 crc 
kubenswrapper[4747]: I1128 13:51:41.502341 4747 scope.go:117] "RemoveContainer" containerID="519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d" Nov 28 13:51:41 crc kubenswrapper[4747]: E1128 13:51:41.504780 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d\": container with ID starting with 519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d not found: ID does not exist" containerID="519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.504840 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d"} err="failed to get container status \"519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d\": rpc error: code = NotFound desc = could not find container \"519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d\": container with ID starting with 519479503fa160a94e8654ca37217f1d20a7f8bd84c3e7077e19bf728ea77e4d not found: ID does not exist" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.515047 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.528444 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.551716 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm"] Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.557782 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/35d1e9597b2c42bb95435a0bf69df49724b44c9fe06a4a87331d9125effmgfm"] Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 
13:51:41.654657 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" path="/var/lib/kubelet/pods/625975dd-71a7-40d7-b99b-7204545ab2d5/volumes" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.655393 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="721907f7-badd-4ac4-aba2-b2915ea6a9cb" path="/var/lib/kubelet/pods/721907f7-badd-4ac4-aba2-b2915ea6a9cb/volumes" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.656146 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e6a2c7-dee9-40e2-a7fa-78038c271647" path="/var/lib/kubelet/pods/b2e6a2c7-dee9-40e2-a7fa-78038c271647/volumes" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.809643 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.928492 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-apiservice-cert\") pod \"8e64428a-5763-44ae-87c3-e45ba2c3a039\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.928552 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpbgc\" (UniqueName: \"kubernetes.io/projected/8e64428a-5763-44ae-87c3-e45ba2c3a039-kube-api-access-tpbgc\") pod \"8e64428a-5763-44ae-87c3-e45ba2c3a039\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.928578 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-webhook-cert\") pod \"8e64428a-5763-44ae-87c3-e45ba2c3a039\" (UID: \"8e64428a-5763-44ae-87c3-e45ba2c3a039\") " Nov 
28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.935644 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "8e64428a-5763-44ae-87c3-e45ba2c3a039" (UID: "8e64428a-5763-44ae-87c3-e45ba2c3a039"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.935661 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "8e64428a-5763-44ae-87c3-e45ba2c3a039" (UID: "8e64428a-5763-44ae-87c3-e45ba2c3a039"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.935731 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e64428a-5763-44ae-87c3-e45ba2c3a039-kube-api-access-tpbgc" (OuterVolumeSpecName: "kube-api-access-tpbgc") pod "8e64428a-5763-44ae-87c3-e45ba2c3a039" (UID: "8e64428a-5763-44ae-87c3-e45ba2c3a039"). InnerVolumeSpecName "kube-api-access-tpbgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:41 crc kubenswrapper[4747]: I1128 13:51:41.966963 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.030227 4747 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.030250 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpbgc\" (UniqueName: \"kubernetes.io/projected/8e64428a-5763-44ae-87c3-e45ba2c3a039-kube-api-access-tpbgc\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.030261 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e64428a-5763-44ae-87c3-e45ba2c3a039-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.131077 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz5qd\" (UniqueName: \"kubernetes.io/projected/c8f39d39-8a82-4e51-9a4c-81d2476e5d42-kube-api-access-zz5qd\") pod \"c8f39d39-8a82-4e51-9a4c-81d2476e5d42\" (UID: \"c8f39d39-8a82-4e51-9a4c-81d2476e5d42\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.134516 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f39d39-8a82-4e51-9a4c-81d2476e5d42-kube-api-access-zz5qd" (OuterVolumeSpecName: "kube-api-access-zz5qd") pod "c8f39d39-8a82-4e51-9a4c-81d2476e5d42" (UID: "c8f39d39-8a82-4e51-9a4c-81d2476e5d42"). InnerVolumeSpecName "kube-api-access-zz5qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.170346 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233060 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-plugins-conf\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233098 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-pod-info\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233570 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233623 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzbgx\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-kube-api-access-rzbgx\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233854 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-plugins\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233920 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-erlang-cookie-secret\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233944 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-confd\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.233983 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-erlang-cookie\") pod \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\" (UID: \"69406e1d-82c4-485d-aaf5-e7c8ead8dc40\") " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.234107 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.234431 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.234735 4747 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.234753 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.234763 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz5qd\" (UniqueName: \"kubernetes.io/projected/c8f39d39-8a82-4e51-9a4c-81d2476e5d42-kube-api-access-zz5qd\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.235526 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.236334 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-pod-info" (OuterVolumeSpecName: "pod-info") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.242502 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-kube-api-access-rzbgx" (OuterVolumeSpecName: "kube-api-access-rzbgx") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "kube-api-access-rzbgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.242900 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.252374 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09" (OuterVolumeSpecName: "persistence") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "pvc-c999fa65-302c-4152-9883-43c6dbeb0c09". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.286549 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "69406e1d-82c4-485d-aaf5-e7c8ead8dc40" (UID: "69406e1d-82c4-485d-aaf5-e7c8ead8dc40"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.335966 4747 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-pod-info\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.336039 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\") on node \"crc\" " Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.336115 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzbgx\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-kube-api-access-rzbgx\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.336130 4747 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.336139 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.336150 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69406e1d-82c4-485d-aaf5-e7c8ead8dc40-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.351125 4747 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.351301 4747 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c999fa65-302c-4152-9883-43c6dbeb0c09" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09") on node "crc" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.437306 4747 reconciler_common.go:293] "Volume detached for volume \"pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c999fa65-302c-4152-9883-43c6dbeb0c09\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.474449 4747 generic.go:334] "Generic (PLEG): container finished" podID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerID="4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9" exitCode=0 Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.474517 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"69406e1d-82c4-485d-aaf5-e7c8ead8dc40","Type":"ContainerDied","Data":"4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9"} Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.474549 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"69406e1d-82c4-485d-aaf5-e7c8ead8dc40","Type":"ContainerDied","Data":"1c755292d5a0ee7204948a3105058ca194406000ab108e552908244d27dd6359"} Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.474567 4747 scope.go:117] "RemoveContainer" containerID="4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.474661 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.477994 4747 generic.go:334] "Generic (PLEG): container finished" podID="c8f39d39-8a82-4e51-9a4c-81d2476e5d42" containerID="9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b" exitCode=0 Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.478275 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-g2f9t" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.478302 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-g2f9t" event={"ID":"c8f39d39-8a82-4e51-9a4c-81d2476e5d42","Type":"ContainerDied","Data":"9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b"} Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.478908 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-g2f9t" event={"ID":"c8f39d39-8a82-4e51-9a4c-81d2476e5d42","Type":"ContainerDied","Data":"839d7668a63cec36e123955c9b45b8a47232b98178e47d38414e9a8b180bcb30"} Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.480127 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" event={"ID":"8e64428a-5763-44ae-87c3-e45ba2c3a039","Type":"ContainerDied","Data":"ceee4b7d7d250a26816cc0e06323bab5f9788e1db8a6a83d7d88c0f0c7878684"} Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.480198 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.503451 4747 scope.go:117] "RemoveContainer" containerID="0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.508832 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.517199 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.529244 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd"] Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.535575 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-589f96b8dd-d5xbd"] Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.541468 4747 scope.go:117] "RemoveContainer" containerID="4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9" Nov 28 13:51:42 crc kubenswrapper[4747]: E1128 13:51:42.542279 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9\": container with ID starting with 4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9 not found: ID does not exist" containerID="4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.542309 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9"} err="failed to get container status \"4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9\": rpc 
error: code = NotFound desc = could not find container \"4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9\": container with ID starting with 4eaf7aebd26e8859b126060eef02e5f7bd82df4ee6198aac39c238bf0409b9d9 not found: ID does not exist" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.542331 4747 scope.go:117] "RemoveContainer" containerID="0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.543394 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-g2f9t"] Nov 28 13:51:42 crc kubenswrapper[4747]: E1128 13:51:42.543492 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d\": container with ID starting with 0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d not found: ID does not exist" containerID="0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.543516 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d"} err="failed to get container status \"0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d\": rpc error: code = NotFound desc = could not find container \"0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d\": container with ID starting with 0472e28471cbe6d8a0e1d61f496ac367f1acb716be3524c11ce3dee3842ee25d not found: ID does not exist" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.543541 4747 scope.go:117] "RemoveContainer" containerID="9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.548420 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/keystone-operator-index-g2f9t"] Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.558240 4747 scope.go:117] "RemoveContainer" containerID="9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b" Nov 28 13:51:42 crc kubenswrapper[4747]: E1128 13:51:42.558641 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b\": container with ID starting with 9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b not found: ID does not exist" containerID="9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.558669 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b"} err="failed to get container status \"9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b\": rpc error: code = NotFound desc = could not find container \"9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b\": container with ID starting with 9ed2074d9f7f46133c543e5a1491faa02d219a8bec95beb856a879971b80908b not found: ID does not exist" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.558691 4747 scope.go:117] "RemoveContainer" containerID="7cb94193d218edb947e0d91e36be5b76272f7596495c4c5388a8a911bdd00bcb" Nov 28 13:51:42 crc kubenswrapper[4747]: E1128 13:51:42.562148 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 28 13:51:42 crc kubenswrapper[4747]: E1128 13:51:42.563809 4747 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 28 13:51:42 crc kubenswrapper[4747]: E1128 13:51:42.565171 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 28 13:51:42 crc kubenswrapper[4747]: E1128 13:51:42.565239 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-1" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerName="galera" Nov 28 13:51:42 crc kubenswrapper[4747]: I1128 13:51:42.912645 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-0" podUID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerName="galera" containerID="cri-o://5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706" gracePeriod=26 Nov 28 13:51:43 crc kubenswrapper[4747]: I1128 13:51:43.648460 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" path="/var/lib/kubelet/pods/69406e1d-82c4-485d-aaf5-e7c8ead8dc40/volumes" Nov 28 13:51:43 crc kubenswrapper[4747]: I1128 13:51:43.649186 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e64428a-5763-44ae-87c3-e45ba2c3a039" path="/var/lib/kubelet/pods/8e64428a-5763-44ae-87c3-e45ba2c3a039/volumes" Nov 28 13:51:43 crc kubenswrapper[4747]: I1128 
13:51:43.649587 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8f39d39-8a82-4e51-9a4c-81d2476e5d42" path="/var/lib/kubelet/pods/c8f39d39-8a82-4e51-9a4c-81d2476e5d42/volumes" Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.210963 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj"] Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.211235 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerName="manager" containerID="cri-o://9fef260230f3c32bb0b4fad410ac9ef66f2982fdc90242d88575b04eabf519a4" gracePeriod=10 Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.211575 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerName="kube-rbac-proxy" containerID="cri-o://cee35a56a70d10ab163196f295f134e068c85da00169c87824895f0ee1fa2bc7" gracePeriod=10 Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.477018 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-psl9q"] Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.477599 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-psl9q" podUID="0d062986-fe87-4371-8ead-8bdb1ebe83ac" containerName="registry-server" containerID="cri-o://7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35" gracePeriod=30 Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.508184 4747 generic.go:334] "Generic (PLEG): container finished" podID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerID="cee35a56a70d10ab163196f295f134e068c85da00169c87824895f0ee1fa2bc7" exitCode=0 Nov 28 13:51:44 crc 
kubenswrapper[4747]: I1128 13:51:44.508259 4747 generic.go:334] "Generic (PLEG): container finished" podID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerID="9fef260230f3c32bb0b4fad410ac9ef66f2982fdc90242d88575b04eabf519a4" exitCode=0 Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.508268 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" event={"ID":"37483c76-950c-49c6-a4f3-aba8c5c8c41a","Type":"ContainerDied","Data":"cee35a56a70d10ab163196f295f134e068c85da00169c87824895f0ee1fa2bc7"} Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.508334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" event={"ID":"37483c76-950c-49c6-a4f3-aba8c5c8c41a","Type":"ContainerDied","Data":"9fef260230f3c32bb0b4fad410ac9ef66f2982fdc90242d88575b04eabf519a4"} Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.509925 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw"] Nov 28 13:51:44 crc kubenswrapper[4747]: I1128 13:51:44.516726 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dx5wtw"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.027791 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.079169 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-operator-scripts\") pod \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.079265 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-default\") pod \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.079289 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kolla-config\") pod \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.079376 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg8m6\" (UniqueName: \"kubernetes.io/projected/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kube-api-access-gg8m6\") pod \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.079402 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.079452 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-generated\") pod \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\" (UID: \"e54ebc7f-c262-4900-9cd3-76fd6280c5f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.080068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e54ebc7f-c262-4900-9cd3-76fd6280c5f6" (UID: "e54ebc7f-c262-4900-9cd3-76fd6280c5f6"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.080403 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e54ebc7f-c262-4900-9cd3-76fd6280c5f6" (UID: "e54ebc7f-c262-4900-9cd3-76fd6280c5f6"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.080641 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e54ebc7f-c262-4900-9cd3-76fd6280c5f6" (UID: "e54ebc7f-c262-4900-9cd3-76fd6280c5f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.080669 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e54ebc7f-c262-4900-9cd3-76fd6280c5f6" (UID: "e54ebc7f-c262-4900-9cd3-76fd6280c5f6"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.086356 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kube-api-access-gg8m6" (OuterVolumeSpecName: "kube-api-access-gg8m6") pod "e54ebc7f-c262-4900-9cd3-76fd6280c5f6" (UID: "e54ebc7f-c262-4900-9cd3-76fd6280c5f6"). InnerVolumeSpecName "kube-api-access-gg8m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.091596 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "e54ebc7f-c262-4900-9cd3-76fd6280c5f6" (UID: "e54ebc7f-c262-4900-9cd3-76fd6280c5f6"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.092887 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.150064 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181296 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-generated\") pod \"23bc6d14-d758-4423-9c06-37b5eeac59f6\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181363 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-operator-scripts\") pod \"23bc6d14-d758-4423-9c06-37b5eeac59f6\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181434 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftj24\" (UniqueName: \"kubernetes.io/projected/23bc6d14-d758-4423-9c06-37b5eeac59f6-kube-api-access-ftj24\") pod \"23bc6d14-d758-4423-9c06-37b5eeac59f6\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181477 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"23bc6d14-d758-4423-9c06-37b5eeac59f6\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181562 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-kolla-config\") pod \"23bc6d14-d758-4423-9c06-37b5eeac59f6\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181609 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-default\") pod \"23bc6d14-d758-4423-9c06-37b5eeac59f6\" (UID: \"23bc6d14-d758-4423-9c06-37b5eeac59f6\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181901 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181927 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181941 4747 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181952 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg8m6\" (UniqueName: \"kubernetes.io/projected/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-kube-api-access-gg8m6\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181979 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.181993 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54ebc7f-c262-4900-9cd3-76fd6280c5f6-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.182103 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "23bc6d14-d758-4423-9c06-37b5eeac59f6" (UID: "23bc6d14-d758-4423-9c06-37b5eeac59f6"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.182318 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "23bc6d14-d758-4423-9c06-37b5eeac59f6" (UID: "23bc6d14-d758-4423-9c06-37b5eeac59f6"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.182385 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23bc6d14-d758-4423-9c06-37b5eeac59f6" (UID: "23bc6d14-d758-4423-9c06-37b5eeac59f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.182654 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "23bc6d14-d758-4423-9c06-37b5eeac59f6" (UID: "23bc6d14-d758-4423-9c06-37b5eeac59f6"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.186951 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23bc6d14-d758-4423-9c06-37b5eeac59f6-kube-api-access-ftj24" (OuterVolumeSpecName: "kube-api-access-ftj24") pod "23bc6d14-d758-4423-9c06-37b5eeac59f6" (UID: "23bc6d14-d758-4423-9c06-37b5eeac59f6"). InnerVolumeSpecName "kube-api-access-ftj24". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.194978 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.195728 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "23bc6d14-d758-4423-9c06-37b5eeac59f6" (UID: "23bc6d14-d758-4423-9c06-37b5eeac59f6"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283146 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmc25\" (UniqueName: \"kubernetes.io/projected/37483c76-950c-49c6-a4f3-aba8c5c8c41a-kube-api-access-fmc25\") pod \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283221 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-webhook-cert\") pod \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283362 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-apiservice-cert\") pod \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\" (UID: \"37483c76-950c-49c6-a4f3-aba8c5c8c41a\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283608 4747 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283627 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283637 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283647 4747 
reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23bc6d14-d758-4423-9c06-37b5eeac59f6-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283657 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23bc6d14-d758-4423-9c06-37b5eeac59f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283666 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftj24\" (UniqueName: \"kubernetes.io/projected/23bc6d14-d758-4423-9c06-37b5eeac59f6-kube-api-access-ftj24\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.283684 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.287708 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37483c76-950c-49c6-a4f3-aba8c5c8c41a-kube-api-access-fmc25" (OuterVolumeSpecName: "kube-api-access-fmc25") pod "37483c76-950c-49c6-a4f3-aba8c5c8c41a" (UID: "37483c76-950c-49c6-a4f3-aba8c5c8c41a"). InnerVolumeSpecName "kube-api-access-fmc25". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.287761 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "37483c76-950c-49c6-a4f3-aba8c5c8c41a" (UID: "37483c76-950c-49c6-a4f3-aba8c5c8c41a"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.287776 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "37483c76-950c-49c6-a4f3-aba8c5c8c41a" (UID: "37483c76-950c-49c6-a4f3-aba8c5c8c41a"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.294288 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.383352 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.384598 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmc25\" (UniqueName: \"kubernetes.io/projected/37483c76-950c-49c6-a4f3-aba8c5c8c41a-kube-api-access-fmc25\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.384631 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.384648 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.384667 4747 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37483c76-950c-49c6-a4f3-aba8c5c8c41a-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc 
kubenswrapper[4747]: I1128 13:51:45.485396 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m97xg\" (UniqueName: \"kubernetes.io/projected/0d062986-fe87-4371-8ead-8bdb1ebe83ac-kube-api-access-m97xg\") pod \"0d062986-fe87-4371-8ead-8bdb1ebe83ac\" (UID: \"0d062986-fe87-4371-8ead-8bdb1ebe83ac\") " Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.489775 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d062986-fe87-4371-8ead-8bdb1ebe83ac-kube-api-access-m97xg" (OuterVolumeSpecName: "kube-api-access-m97xg") pod "0d062986-fe87-4371-8ead-8bdb1ebe83ac" (UID: "0d062986-fe87-4371-8ead-8bdb1ebe83ac"). InnerVolumeSpecName "kube-api-access-m97xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.515693 4747 generic.go:334] "Generic (PLEG): container finished" podID="0d062986-fe87-4371-8ead-8bdb1ebe83ac" containerID="7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35" exitCode=0 Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.515741 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-psl9q" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.515761 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-psl9q" event={"ID":"0d062986-fe87-4371-8ead-8bdb1ebe83ac","Type":"ContainerDied","Data":"7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35"} Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.515818 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-psl9q" event={"ID":"0d062986-fe87-4371-8ead-8bdb1ebe83ac","Type":"ContainerDied","Data":"801b7f0bee06238b8ce2bc489a8347a2d9085dd59852c811bc80d5f053f81350"} Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.515836 4747 scope.go:117] "RemoveContainer" containerID="7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.519655 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.519662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj" event={"ID":"37483c76-950c-49c6-a4f3-aba8c5c8c41a","Type":"ContainerDied","Data":"3dbd5af8be3f50c07df09fcdba35c35f1bbd1e1ab612487718aadfdea60095ac"} Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.521960 4747 generic.go:334] "Generic (PLEG): container finished" podID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerID="5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706" exitCode=0 Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.522025 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"23bc6d14-d758-4423-9c06-37b5eeac59f6","Type":"ContainerDied","Data":"5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706"} Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.522054 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"23bc6d14-d758-4423-9c06-37b5eeac59f6","Type":"ContainerDied","Data":"bc9934fc0971080cdc7e36ce94cbb85a6097d3be858b1db4284c864843f16907"} Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.522109 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.526353 4747 generic.go:334] "Generic (PLEG): container finished" podID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerID="43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" exitCode=0 Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.526410 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"e54ebc7f-c262-4900-9cd3-76fd6280c5f6","Type":"ContainerDied","Data":"43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91"} Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.526439 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"e54ebc7f-c262-4900-9cd3-76fd6280c5f6","Type":"ContainerDied","Data":"deab8159325ca167c0dc432a991a26bcf884d337eca00d234e32228a1864267a"} Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.526495 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.535880 4747 scope.go:117] "RemoveContainer" containerID="7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35" Nov 28 13:51:45 crc kubenswrapper[4747]: E1128 13:51:45.536251 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35\": container with ID starting with 7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35 not found: ID does not exist" containerID="7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.536300 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35"} err="failed to get container status \"7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35\": rpc error: code = NotFound desc = could not find container \"7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35\": container with ID starting with 7737f062618d3372457b80de71383ba837ec10bc98a8c828e909b7cc87950c35 not found: ID does not exist" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.536333 4747 scope.go:117] "RemoveContainer" containerID="cee35a56a70d10ab163196f295f134e068c85da00169c87824895f0ee1fa2bc7" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.559680 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-psl9q"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.559984 4747 scope.go:117] "RemoveContainer" containerID="9fef260230f3c32bb0b4fad410ac9ef66f2982fdc90242d88575b04eabf519a4" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.563340 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/infra-operator-index-psl9q"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.572278 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.575736 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.586746 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m97xg\" (UniqueName: \"kubernetes.io/projected/0d062986-fe87-4371-8ead-8bdb1ebe83ac-kube-api-access-m97xg\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.590523 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.591347 4747 scope.go:117] "RemoveContainer" containerID="5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.594906 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.608658 4747 scope.go:117] "RemoveContainer" containerID="bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.609120 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.613335 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-74f9c56665-z2jpj"] Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.625159 4747 scope.go:117] "RemoveContainer" containerID="5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706" Nov 28 13:51:45 crc kubenswrapper[4747]: E1128 13:51:45.625692 4747 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706\": container with ID starting with 5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706 not found: ID does not exist" containerID="5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.625747 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706"} err="failed to get container status \"5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706\": rpc error: code = NotFound desc = could not find container \"5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706\": container with ID starting with 5113e2b51b6d628fc132eefc78cdf28bb953bf842c13bda5d13957b99e15a706 not found: ID does not exist" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.625785 4747 scope.go:117] "RemoveContainer" containerID="bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e" Nov 28 13:51:45 crc kubenswrapper[4747]: E1128 13:51:45.626494 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e\": container with ID starting with bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e not found: ID does not exist" containerID="bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.626532 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e"} err="failed to get container status \"bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e\": rpc error: code = NotFound 
desc = could not find container \"bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e\": container with ID starting with bf7472010721ff33d06f65a7d961057158b9104d1874759d3b2b01044e13d12e not found: ID does not exist" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.626575 4747 scope.go:117] "RemoveContainer" containerID="43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.645710 4747 scope.go:117] "RemoveContainer" containerID="7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.649775 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d062986-fe87-4371-8ead-8bdb1ebe83ac" path="/var/lib/kubelet/pods/0d062986-fe87-4371-8ead-8bdb1ebe83ac/volumes" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.650427 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23bc6d14-d758-4423-9c06-37b5eeac59f6" path="/var/lib/kubelet/pods/23bc6d14-d758-4423-9c06-37b5eeac59f6/volumes" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.651122 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" path="/var/lib/kubelet/pods/37483c76-950c-49c6-a4f3-aba8c5c8c41a/volumes" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.652150 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ae9e4a-ff17-4203-95f7-de7d9690f798" path="/var/lib/kubelet/pods/c6ae9e4a-ff17-4203-95f7-de7d9690f798/volumes" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.652886 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" path="/var/lib/kubelet/pods/e54ebc7f-c262-4900-9cd3-76fd6280c5f6/volumes" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.664634 4747 scope.go:117] "RemoveContainer" 
containerID="43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" Nov 28 13:51:45 crc kubenswrapper[4747]: E1128 13:51:45.665617 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91\": container with ID starting with 43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91 not found: ID does not exist" containerID="43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.665647 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91"} err="failed to get container status \"43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91\": rpc error: code = NotFound desc = could not find container \"43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91\": container with ID starting with 43cc2dfd7d6d904856e20d41eea5bd4818f2fac2444c920adc3e72f2ea26dd91 not found: ID does not exist" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.665673 4747 scope.go:117] "RemoveContainer" containerID="7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d" Nov 28 13:51:45 crc kubenswrapper[4747]: E1128 13:51:45.665977 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d\": container with ID starting with 7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d not found: ID does not exist" containerID="7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d" Nov 28 13:51:45 crc kubenswrapper[4747]: I1128 13:51:45.666013 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d"} err="failed to get container status \"7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d\": rpc error: code = NotFound desc = could not find container \"7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d\": container with ID starting with 7547e10f5adf5c1c8c1ec1a762f7814ff6ef85196bb5104787f51f7d4bc5188d not found: ID does not exist" Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.255065 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b"] Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.255698 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerName="manager" containerID="cri-o://783e4bfae6db909013827700759c2c336710875785103324248e08a5f6cc6fef" gracePeriod=10 Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.496369 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-d94nr"] Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.496582 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-d94nr" podUID="ea3532b2-7349-4405-8425-5574724d1b9d" containerName="registry-server" containerID="cri-o://a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143" gracePeriod=30 Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.528581 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx"] Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.532018 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fvzhrx"] Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.534948 4747 generic.go:334] "Generic (PLEG): container finished" podID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerID="783e4bfae6db909013827700759c2c336710875785103324248e08a5f6cc6fef" exitCode=0 Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.535032 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" event={"ID":"178b0a8c-9539-48dd-b483-09228bc22b6d","Type":"ContainerDied","Data":"783e4bfae6db909013827700759c2c336710875785103324248e08a5f6cc6fef"} Nov 28 13:51:46 crc kubenswrapper[4747]: I1128 13:51:46.535067 4747 scope.go:117] "RemoveContainer" containerID="67f8ee89b90ec1bc94d6660de3a0d7513231c0e92f0296d258674e7e2b8eb714" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.150830 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.207955 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcwf8\" (UniqueName: \"kubernetes.io/projected/178b0a8c-9539-48dd-b483-09228bc22b6d-kube-api-access-wcwf8\") pod \"178b0a8c-9539-48dd-b483-09228bc22b6d\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.208270 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-apiservice-cert\") pod \"178b0a8c-9539-48dd-b483-09228bc22b6d\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.208367 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-webhook-cert\") pod \"178b0a8c-9539-48dd-b483-09228bc22b6d\" (UID: \"178b0a8c-9539-48dd-b483-09228bc22b6d\") " Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.212769 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178b0a8c-9539-48dd-b483-09228bc22b6d-kube-api-access-wcwf8" (OuterVolumeSpecName: "kube-api-access-wcwf8") pod "178b0a8c-9539-48dd-b483-09228bc22b6d" (UID: "178b0a8c-9539-48dd-b483-09228bc22b6d"). InnerVolumeSpecName "kube-api-access-wcwf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.213047 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "178b0a8c-9539-48dd-b483-09228bc22b6d" (UID: "178b0a8c-9539-48dd-b483-09228bc22b6d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.215447 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "178b0a8c-9539-48dd-b483-09228bc22b6d" (UID: "178b0a8c-9539-48dd-b483-09228bc22b6d"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.310173 4747 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.310201 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/178b0a8c-9539-48dd-b483-09228bc22b6d-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.310254 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcwf8\" (UniqueName: \"kubernetes.io/projected/178b0a8c-9539-48dd-b483-09228bc22b6d-kube-api-access-wcwf8\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.363324 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.433640 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm7nf\" (UniqueName: \"kubernetes.io/projected/ea3532b2-7349-4405-8425-5574724d1b9d-kube-api-access-gm7nf\") pod \"ea3532b2-7349-4405-8425-5574724d1b9d\" (UID: \"ea3532b2-7349-4405-8425-5574724d1b9d\") " Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.443034 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3532b2-7349-4405-8425-5574724d1b9d-kube-api-access-gm7nf" (OuterVolumeSpecName: "kube-api-access-gm7nf") pod "ea3532b2-7349-4405-8425-5574724d1b9d" (UID: "ea3532b2-7349-4405-8425-5574724d1b9d"). InnerVolumeSpecName "kube-api-access-gm7nf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.535478 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm7nf\" (UniqueName: \"kubernetes.io/projected/ea3532b2-7349-4405-8425-5574724d1b9d-kube-api-access-gm7nf\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.547059 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" event={"ID":"178b0a8c-9539-48dd-b483-09228bc22b6d","Type":"ContainerDied","Data":"392e0d81152674bfe317ae9cdd63a92f4807933f5f63b5b35b1d5480844ef447"} Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.547122 4747 scope.go:117] "RemoveContainer" containerID="783e4bfae6db909013827700759c2c336710875785103324248e08a5f6cc6fef" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.547074 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.548684 4747 generic.go:334] "Generic (PLEG): container finished" podID="ea3532b2-7349-4405-8425-5574724d1b9d" containerID="a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143" exitCode=0 Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.548736 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d94nr" event={"ID":"ea3532b2-7349-4405-8425-5574724d1b9d","Type":"ContainerDied","Data":"a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143"} Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.548762 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-d94nr" event={"ID":"ea3532b2-7349-4405-8425-5574724d1b9d","Type":"ContainerDied","Data":"028e2e192d4826a56db0637093c4d09feb9258240ccd49da7d702db2b9da73f7"} Nov 28 
13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.548769 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-d94nr" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.563958 4747 scope.go:117] "RemoveContainer" containerID="a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.583137 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b"] Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.587735 4747 scope.go:117] "RemoveContainer" containerID="a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143" Nov 28 13:51:47 crc kubenswrapper[4747]: E1128 13:51:47.588187 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143\": container with ID starting with a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143 not found: ID does not exist" containerID="a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.588344 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143"} err="failed to get container status \"a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143\": rpc error: code = NotFound desc = could not find container \"a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143\": container with ID starting with a76e71a76ecf80ab4c4dac0e1059d5d2443ff5cdd55fa8fc9151cc7ecd644143 not found: ID does not exist" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.588368 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-6d7db75cbb-kfl8b"] Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.597344 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-d94nr"] Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.602195 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-d94nr"] Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.633149 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.633219 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.651569 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" path="/var/lib/kubelet/pods/178b0a8c-9539-48dd-b483-09228bc22b6d/volumes" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.654177 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b9b61c-6235-4ba0-9536-dd4bf65903b5" path="/var/lib/kubelet/pods/94b9b61c-6235-4ba0-9536-dd4bf65903b5/volumes" Nov 28 13:51:47 crc kubenswrapper[4747]: I1128 13:51:47.654869 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3532b2-7349-4405-8425-5574724d1b9d" path="/var/lib/kubelet/pods/ea3532b2-7349-4405-8425-5574724d1b9d/volumes" Nov 28 13:51:48 crc kubenswrapper[4747]: I1128 13:51:48.872749 4747 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm"] Nov 28 13:51:48 crc kubenswrapper[4747]: I1128 13:51:48.873255 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" podUID="2d18bad5-470c-4359-b98f-0cfcabfe8694" containerName="operator" containerID="cri-o://da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a" gracePeriod=10 Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.152260 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tdss5"] Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.152466 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" podUID="40bbf056-1cae-46c3-94a8-2f74f517cf31" containerName="registry-server" containerID="cri-o://2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317" gracePeriod=30 Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.179545 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj"] Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.188965 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qvxtj"] Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.245614 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.358879 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr8p8\" (UniqueName: \"kubernetes.io/projected/2d18bad5-470c-4359-b98f-0cfcabfe8694-kube-api-access-gr8p8\") pod \"2d18bad5-470c-4359-b98f-0cfcabfe8694\" (UID: \"2d18bad5-470c-4359-b98f-0cfcabfe8694\") " Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.364500 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d18bad5-470c-4359-b98f-0cfcabfe8694-kube-api-access-gr8p8" (OuterVolumeSpecName: "kube-api-access-gr8p8") pod "2d18bad5-470c-4359-b98f-0cfcabfe8694" (UID: "2d18bad5-470c-4359-b98f-0cfcabfe8694"). InnerVolumeSpecName "kube-api-access-gr8p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.456725 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.459896 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr8p8\" (UniqueName: \"kubernetes.io/projected/2d18bad5-470c-4359-b98f-0cfcabfe8694-kube-api-access-gr8p8\") on node \"crc\" DevicePath \"\"" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.560477 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zl5d\" (UniqueName: \"kubernetes.io/projected/40bbf056-1cae-46c3-94a8-2f74f517cf31-kube-api-access-4zl5d\") pod \"40bbf056-1cae-46c3-94a8-2f74f517cf31\" (UID: \"40bbf056-1cae-46c3-94a8-2f74f517cf31\") " Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.563682 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40bbf056-1cae-46c3-94a8-2f74f517cf31-kube-api-access-4zl5d" (OuterVolumeSpecName: "kube-api-access-4zl5d") pod "40bbf056-1cae-46c3-94a8-2f74f517cf31" (UID: "40bbf056-1cae-46c3-94a8-2f74f517cf31"). InnerVolumeSpecName "kube-api-access-4zl5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.564635 4747 generic.go:334] "Generic (PLEG): container finished" podID="2d18bad5-470c-4359-b98f-0cfcabfe8694" containerID="da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a" exitCode=0 Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.564699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" event={"ID":"2d18bad5-470c-4359-b98f-0cfcabfe8694","Type":"ContainerDied","Data":"da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a"} Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.564720 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.564740 4747 scope.go:117] "RemoveContainer" containerID="da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.564726 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm" event={"ID":"2d18bad5-470c-4359-b98f-0cfcabfe8694","Type":"ContainerDied","Data":"6ca07a2427eb6e3130432646a1e617579336a1ac5802e7481e60f08fab5ae71f"} Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.568423 4747 generic.go:334] "Generic (PLEG): container finished" podID="40bbf056-1cae-46c3-94a8-2f74f517cf31" containerID="2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317" exitCode=0 Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.568468 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" event={"ID":"40bbf056-1cae-46c3-94a8-2f74f517cf31","Type":"ContainerDied","Data":"2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317"} Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.568477 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.568497 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tdss5" event={"ID":"40bbf056-1cae-46c3-94a8-2f74f517cf31","Type":"ContainerDied","Data":"81a838177dded7b9ddf89028cd1ccab66320adf644eb48403170d0accb490739"} Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.580976 4747 scope.go:117] "RemoveContainer" containerID="da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a" Nov 28 13:51:49 crc kubenswrapper[4747]: E1128 13:51:49.581430 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a\": container with ID starting with da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a not found: ID does not exist" containerID="da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.581467 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a"} err="failed to get container status \"da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a\": rpc error: code = NotFound desc = could not find container \"da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a\": container with ID starting with da9f44745754adb2efd32ae0991db1fb5e128736c54ca9dbafff5347530e9b2a not found: ID does not exist" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.581497 4747 scope.go:117] "RemoveContainer" containerID="2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.600036 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-index-tdss5"] Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.607759 4747 scope.go:117] "RemoveContainer" containerID="2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317" Nov 28 13:51:49 crc kubenswrapper[4747]: E1128 13:51:49.608537 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317\": container with ID starting with 2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317 not found: ID does not exist" containerID="2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.608612 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317"} err="failed to get container status \"2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317\": rpc error: code = NotFound desc = could not find container \"2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317\": container with ID starting with 2303a6860eb93bb8c8845799980c41843eb5a4090bcd31d1c547a6ae87aad317 not found: ID does not exist" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.620139 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tdss5"] Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.624849 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm"] Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.628055 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7sfzm"] Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.650774 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2d18bad5-470c-4359-b98f-0cfcabfe8694" path="/var/lib/kubelet/pods/2d18bad5-470c-4359-b98f-0cfcabfe8694/volumes" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.651347 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40bbf056-1cae-46c3-94a8-2f74f517cf31" path="/var/lib/kubelet/pods/40bbf056-1cae-46c3-94a8-2f74f517cf31/volumes" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.651855 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75538f50-4039-471c-842f-85941607c65e" path="/var/lib/kubelet/pods/75538f50-4039-471c-842f-85941607c65e/volumes" Nov 28 13:51:49 crc kubenswrapper[4747]: I1128 13:51:49.662546 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zl5d\" (UniqueName: \"kubernetes.io/projected/40bbf056-1cae-46c3-94a8-2f74f517cf31-kube-api-access-4zl5d\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.936064 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8fj9t/must-gather-cqv78"] Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937352 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d18bad5-470c-4359-b98f-0cfcabfe8694" containerName="operator" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937370 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d18bad5-470c-4359-b98f-0cfcabfe8694" containerName="operator" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937380 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerName="mysql-bootstrap" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937387 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerName="mysql-bootstrap" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937402 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2d9d2f6-af70-446e-92b0-d41d8af9f656" containerName="mariadb-account-delete" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937415 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d9d2f6-af70-446e-92b0-d41d8af9f656" containerName="mariadb-account-delete" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937425 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937431 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937440 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerName="setup-container" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937447 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerName="setup-container" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937460 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerName="kube-rbac-proxy" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937466 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerName="kube-rbac-proxy" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937478 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerName="mysql-bootstrap" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937484 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerName="mysql-bootstrap" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937493 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937499 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937508 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerName="mysql-bootstrap" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937514 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerName="mysql-bootstrap" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937523 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d062986-fe87-4371-8ead-8bdb1ebe83ac" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937530 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d062986-fe87-4371-8ead-8bdb1ebe83ac" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937541 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" containerName="keystone-api" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937548 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" containerName="keystone-api" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937558 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e6a2c7-dee9-40e2-a7fa-78038c271647" containerName="memcached" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937565 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e6a2c7-dee9-40e2-a7fa-78038c271647" containerName="memcached" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937574 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" 
containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937581 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937590 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937595 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937604 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937610 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937618 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f39d39-8a82-4e51-9a4c-81d2476e5d42" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937625 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f39d39-8a82-4e51-9a4c-81d2476e5d42" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937636 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3532b2-7349-4405-8425-5574724d1b9d" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937644 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3532b2-7349-4405-8425-5574724d1b9d" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937652 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerName="rabbitmq" Nov 28 13:52:02 crc 
kubenswrapper[4747]: I1128 13:52:02.937660 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerName="rabbitmq" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937667 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bbf056-1cae-46c3-94a8-2f74f517cf31" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937673 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bbf056-1cae-46c3-94a8-2f74f517cf31" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937679 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937685 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: E1128 13:52:02.937693 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e64428a-5763-44ae-87c3-e45ba2c3a039" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937699 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e64428a-5763-44ae-87c3-e45ba2c3a039" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937841 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e6a2c7-dee9-40e2-a7fa-78038c271647" containerName="memcached" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937854 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="40bbf056-1cae-46c3-94a8-2f74f517cf31" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937864 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="625975dd-71a7-40d7-b99b-7204545ab2d5" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937874 4747 
memory_manager.go:354] "RemoveStaleState removing state" podUID="23bc6d14-d758-4423-9c06-37b5eeac59f6" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937883 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e64428a-5763-44ae-87c3-e45ba2c3a039" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937891 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937901 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937910 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3532b2-7349-4405-8425-5574724d1b9d" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937918 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="178b0a8c-9539-48dd-b483-09228bc22b6d" containerName="manager" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937930 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d9d2f6-af70-446e-92b0-d41d8af9f656" containerName="mariadb-account-delete" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937938 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d18bad5-470c-4359-b98f-0cfcabfe8694" containerName="operator" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937948 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="69406e1d-82c4-485d-aaf5-e7c8ead8dc40" containerName="rabbitmq" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937958 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d062986-fe87-4371-8ead-8bdb1ebe83ac" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937966 4747 
memory_manager.go:354] "RemoveStaleState removing state" podUID="37483c76-950c-49c6-a4f3-aba8c5c8c41a" containerName="kube-rbac-proxy" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937974 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54ebc7f-c262-4900-9cd3-76fd6280c5f6" containerName="galera" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937983 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0d1e57-6488-4fd6-bbd8-16ae6be6a119" containerName="keystone-api" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.937990 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f39d39-8a82-4e51-9a4c-81d2476e5d42" containerName="registry-server" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.938833 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.950296 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8fj9t"/"kube-root-ca.crt" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.951039 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8fj9t"/"openshift-service-ca.crt" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.953620 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8fj9t"/"default-dockercfg-cwb59" Nov 28 13:52:02 crc kubenswrapper[4747]: I1128 13:52:02.961941 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8fj9t/must-gather-cqv78"] Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.055608 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n777w\" (UniqueName: \"kubernetes.io/projected/e92866e6-9889-472c-8721-5a7b89d1b1a8-kube-api-access-n777w\") pod \"must-gather-cqv78\" (UID: 
\"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.055709 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e92866e6-9889-472c-8721-5a7b89d1b1a8-must-gather-output\") pod \"must-gather-cqv78\" (UID: \"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.157325 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e92866e6-9889-472c-8721-5a7b89d1b1a8-must-gather-output\") pod \"must-gather-cqv78\" (UID: \"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.157464 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n777w\" (UniqueName: \"kubernetes.io/projected/e92866e6-9889-472c-8721-5a7b89d1b1a8-kube-api-access-n777w\") pod \"must-gather-cqv78\" (UID: \"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.157960 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e92866e6-9889-472c-8721-5a7b89d1b1a8-must-gather-output\") pod \"must-gather-cqv78\" (UID: \"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.188132 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n777w\" (UniqueName: \"kubernetes.io/projected/e92866e6-9889-472c-8721-5a7b89d1b1a8-kube-api-access-n777w\") pod \"must-gather-cqv78\" (UID: 
\"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.260339 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:52:03 crc kubenswrapper[4747]: I1128 13:52:03.678484 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8fj9t/must-gather-cqv78"] Nov 28 13:52:04 crc kubenswrapper[4747]: I1128 13:52:04.681605 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fj9t/must-gather-cqv78" event={"ID":"e92866e6-9889-472c-8721-5a7b89d1b1a8","Type":"ContainerStarted","Data":"5f99135b78a33ad9a2ddf4fe6d8c1d0b9be16c6b400bb777f3ada8ed2d557b48"} Nov 28 13:52:11 crc kubenswrapper[4747]: I1128 13:52:11.732032 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fj9t/must-gather-cqv78" event={"ID":"e92866e6-9889-472c-8721-5a7b89d1b1a8","Type":"ContainerStarted","Data":"7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e"} Nov 28 13:52:11 crc kubenswrapper[4747]: I1128 13:52:11.732637 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fj9t/must-gather-cqv78" event={"ID":"e92866e6-9889-472c-8721-5a7b89d1b1a8","Type":"ContainerStarted","Data":"157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354"} Nov 28 13:52:11 crc kubenswrapper[4747]: I1128 13:52:11.747750 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8fj9t/must-gather-cqv78" podStartSLOduration=2.45994055 podStartE2EDuration="9.747729083s" podCreationTimestamp="2025-11-28 13:52:02 +0000 UTC" firstStartedPulling="2025-11-28 13:52:03.690379264 +0000 UTC m=+1976.352861004" lastFinishedPulling="2025-11-28 13:52:10.978167787 +0000 UTC m=+1983.640649537" observedRunningTime="2025-11-28 13:52:11.744377579 +0000 UTC m=+1984.406859309" 
watchObservedRunningTime="2025-11-28 13:52:11.747729083 +0000 UTC m=+1984.410210813" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.476874 4747 scope.go:117] "RemoveContainer" containerID="6ac503c482c819ae5435a6f748c00cce9824025147e07bf86ca27fc6746d4d66" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.498542 4747 scope.go:117] "RemoveContainer" containerID="08917f44aced52183a52e119529e2ccea1cb11fdb3cdfc61c818ef39b0d713c9" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.520549 4747 scope.go:117] "RemoveContainer" containerID="523d2a6c9fec4e2b181e95a4324d357e6f29e48cc4dc06cc3a211cc59184d1fd" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.545282 4747 scope.go:117] "RemoveContainer" containerID="bb137ca1adf7e14e1f95e919f1b67891652d5a13194ec27b7ac6e5993d70ecef" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.565644 4747 scope.go:117] "RemoveContainer" containerID="85f8bfaa90ebdc70b3f219f16bc03f84b04c7b335222cdadc00a9780550f57c0" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.594093 4747 scope.go:117] "RemoveContainer" containerID="ffce513af174caf5f25e9ef78dfddd59ecade449acc0bf2440b313b6184e57ee" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.615469 4747 scope.go:117] "RemoveContainer" containerID="dd9c6ef8beb9b315cf7edd21e213068c5efc6ffcc4995b9a1ba243122de303a3" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.636141 4747 scope.go:117] "RemoveContainer" containerID="e275b3e03de128d226d1b08411facc8073c5977d0b5053adbbbf538c92e4e835" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.655780 4747 scope.go:117] "RemoveContainer" containerID="d573ccf19009ccc301aa54cb94f8e0e88b14c27df227b9f97373b1d45c740572" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.671795 4747 scope.go:117] "RemoveContainer" containerID="c00c8da01d7c8c5acd03b8d257df4f8996c95894d29cf735e5d8f4fe15cd1cdc" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.686707 4747 scope.go:117] "RemoveContainer" 
containerID="d64055eb6ce40a045ae28e56e44b66b5ff9a14a19933189dce9ebafa3c8bcad7" Nov 28 13:52:14 crc kubenswrapper[4747]: I1128 13:52:14.706583 4747 scope.go:117] "RemoveContainer" containerID="4f0e292ecff44610335134bb94e5a6fdb5e432feb44a6b8f722d58c62b85b4ca" Nov 28 13:52:17 crc kubenswrapper[4747]: I1128 13:52:17.633040 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:52:17 crc kubenswrapper[4747]: I1128 13:52:17.633106 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:52:17 crc kubenswrapper[4747]: I1128 13:52:17.633157 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:52:17 crc kubenswrapper[4747]: I1128 13:52:17.633731 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"067619c9f8cf3335b45d110ae252404003be27cf7880dcac06aace80a9501192"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:52:17 crc kubenswrapper[4747]: I1128 13:52:17.633782 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" 
containerID="cri-o://067619c9f8cf3335b45d110ae252404003be27cf7880dcac06aace80a9501192" gracePeriod=600 Nov 28 13:52:18 crc kubenswrapper[4747]: E1128 13:52:18.677843 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc55136c_24a8_4913_b8b9_afe93e54fd83.slice/crio-conmon-067619c9f8cf3335b45d110ae252404003be27cf7880dcac06aace80a9501192.scope\": RecentStats: unable to find data in memory cache]" Nov 28 13:52:18 crc kubenswrapper[4747]: I1128 13:52:18.790243 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="067619c9f8cf3335b45d110ae252404003be27cf7880dcac06aace80a9501192" exitCode=0 Nov 28 13:52:18 crc kubenswrapper[4747]: I1128 13:52:18.790284 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"067619c9f8cf3335b45d110ae252404003be27cf7880dcac06aace80a9501192"} Nov 28 13:52:18 crc kubenswrapper[4747]: I1128 13:52:18.790388 4747 scope.go:117] "RemoveContainer" containerID="d856b13b69d24fa4031751e4d56cc342a88e566082b38800528afa7576362161" Nov 28 13:52:19 crc kubenswrapper[4747]: I1128 13:52:19.800239 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b"} Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.334595 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n6qkx"] Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.336467 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.345039 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6qkx"] Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.425505 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnfcw\" (UniqueName: \"kubernetes.io/projected/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-kube-api-access-bnfcw\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.425588 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-utilities\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.425610 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-catalog-content\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.526487 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnfcw\" (UniqueName: \"kubernetes.io/projected/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-kube-api-access-bnfcw\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.526602 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-utilities\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.526627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-catalog-content\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.527150 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-catalog-content\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.527218 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-utilities\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.545464 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnfcw\" (UniqueName: \"kubernetes.io/projected/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-kube-api-access-bnfcw\") pod \"redhat-operators-n6qkx\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:36 crc kubenswrapper[4747]: I1128 13:52:36.656928 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:37 crc kubenswrapper[4747]: I1128 13:52:37.152218 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6qkx"] Nov 28 13:52:37 crc kubenswrapper[4747]: W1128 13:52:37.159328 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8abe32c_68a5_4fb3_8b11_1d10b3dff664.slice/crio-4fe651fd2461c85af0e91285533ac6340cac517a1ddeb0dda8b9ec3b80373f22 WatchSource:0}: Error finding container 4fe651fd2461c85af0e91285533ac6340cac517a1ddeb0dda8b9ec3b80373f22: Status 404 returned error can't find the container with id 4fe651fd2461c85af0e91285533ac6340cac517a1ddeb0dda8b9ec3b80373f22 Nov 28 13:52:37 crc kubenswrapper[4747]: I1128 13:52:37.923703 4747 generic.go:334] "Generic (PLEG): container finished" podID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerID="ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b" exitCode=0 Nov 28 13:52:37 crc kubenswrapper[4747]: I1128 13:52:37.923755 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6qkx" event={"ID":"c8abe32c-68a5-4fb3-8b11-1d10b3dff664","Type":"ContainerDied","Data":"ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b"} Nov 28 13:52:37 crc kubenswrapper[4747]: I1128 13:52:37.923993 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6qkx" event={"ID":"c8abe32c-68a5-4fb3-8b11-1d10b3dff664","Type":"ContainerStarted","Data":"4fe651fd2461c85af0e91285533ac6340cac517a1ddeb0dda8b9ec3b80373f22"} Nov 28 13:52:40 crc kubenswrapper[4747]: I1128 13:52:40.942879 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6qkx" 
event={"ID":"c8abe32c-68a5-4fb3-8b11-1d10b3dff664","Type":"ContainerStarted","Data":"3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a"} Nov 28 13:52:41 crc kubenswrapper[4747]: I1128 13:52:41.950157 4747 generic.go:334] "Generic (PLEG): container finished" podID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerID="3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a" exitCode=0 Nov 28 13:52:41 crc kubenswrapper[4747]: I1128 13:52:41.950240 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6qkx" event={"ID":"c8abe32c-68a5-4fb3-8b11-1d10b3dff664","Type":"ContainerDied","Data":"3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a"} Nov 28 13:52:46 crc kubenswrapper[4747]: I1128 13:52:46.979454 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6qkx" event={"ID":"c8abe32c-68a5-4fb3-8b11-1d10b3dff664","Type":"ContainerStarted","Data":"368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807"} Nov 28 13:52:55 crc kubenswrapper[4747]: I1128 13:52:55.949854 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dw4lg_1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed/control-plane-machine-set-operator/0.log" Nov 28 13:52:56 crc kubenswrapper[4747]: I1128 13:52:56.083581 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hlq5_8a7ae477-ca56-4362-a334-a2915d71fdf0/kube-rbac-proxy/0.log" Nov 28 13:52:56 crc kubenswrapper[4747]: I1128 13:52:56.083800 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hlq5_8a7ae477-ca56-4362-a334-a2915d71fdf0/machine-api-operator/0.log" Nov 28 13:52:56 crc kubenswrapper[4747]: I1128 13:52:56.657753 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:56 crc kubenswrapper[4747]: I1128 13:52:56.657821 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:56 crc kubenswrapper[4747]: I1128 13:52:56.713865 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:56 crc kubenswrapper[4747]: I1128 13:52:56.736906 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n6qkx" podStartSLOduration=12.828353868 podStartE2EDuration="20.736840135s" podCreationTimestamp="2025-11-28 13:52:36 +0000 UTC" firstStartedPulling="2025-11-28 13:52:37.926250335 +0000 UTC m=+2010.588732065" lastFinishedPulling="2025-11-28 13:52:45.834736602 +0000 UTC m=+2018.497218332" observedRunningTime="2025-11-28 13:52:47.002369942 +0000 UTC m=+2019.664851672" watchObservedRunningTime="2025-11-28 13:52:56.736840135 +0000 UTC m=+2029.399321885" Nov 28 13:52:57 crc kubenswrapper[4747]: I1128 13:52:57.072363 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:57 crc kubenswrapper[4747]: I1128 13:52:57.111080 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n6qkx"] Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.047454 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n6qkx" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="registry-server" containerID="cri-o://368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807" gracePeriod=2 Nov 28 13:52:59 crc kubenswrapper[4747]: E1128 13:52:59.189194 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8abe32c_68a5_4fb3_8b11_1d10b3dff664.slice/crio-conmon-368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807.scope\": RecentStats: unable to find data in memory cache]" Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.391303 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.535038 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-utilities\") pod \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.535121 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-catalog-content\") pod \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.535174 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnfcw\" (UniqueName: \"kubernetes.io/projected/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-kube-api-access-bnfcw\") pod \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\" (UID: \"c8abe32c-68a5-4fb3-8b11-1d10b3dff664\") " Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.536035 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-utilities" (OuterVolumeSpecName: "utilities") pod "c8abe32c-68a5-4fb3-8b11-1d10b3dff664" (UID: "c8abe32c-68a5-4fb3-8b11-1d10b3dff664"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.544317 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-kube-api-access-bnfcw" (OuterVolumeSpecName: "kube-api-access-bnfcw") pod "c8abe32c-68a5-4fb3-8b11-1d10b3dff664" (UID: "c8abe32c-68a5-4fb3-8b11-1d10b3dff664"). InnerVolumeSpecName "kube-api-access-bnfcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.636609 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.636638 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnfcw\" (UniqueName: \"kubernetes.io/projected/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-kube-api-access-bnfcw\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.665070 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8abe32c-68a5-4fb3-8b11-1d10b3dff664" (UID: "c8abe32c-68a5-4fb3-8b11-1d10b3dff664"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:52:59 crc kubenswrapper[4747]: I1128 13:52:59.738188 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8abe32c-68a5-4fb3-8b11-1d10b3dff664-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.054912 4747 generic.go:334] "Generic (PLEG): container finished" podID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerID="368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807" exitCode=0 Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.054958 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6qkx" event={"ID":"c8abe32c-68a5-4fb3-8b11-1d10b3dff664","Type":"ContainerDied","Data":"368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807"} Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.054986 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6qkx" event={"ID":"c8abe32c-68a5-4fb3-8b11-1d10b3dff664","Type":"ContainerDied","Data":"4fe651fd2461c85af0e91285533ac6340cac517a1ddeb0dda8b9ec3b80373f22"} Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.054994 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6qkx" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.055007 4747 scope.go:117] "RemoveContainer" containerID="368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.070595 4747 scope.go:117] "RemoveContainer" containerID="3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.082078 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n6qkx"] Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.099544 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n6qkx"] Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.102622 4747 scope.go:117] "RemoveContainer" containerID="ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.122019 4747 scope.go:117] "RemoveContainer" containerID="368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807" Nov 28 13:53:00 crc kubenswrapper[4747]: E1128 13:53:00.124699 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807\": container with ID starting with 368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807 not found: ID does not exist" containerID="368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.124758 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807"} err="failed to get container status \"368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807\": rpc error: code = NotFound desc = could not find container 
\"368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807\": container with ID starting with 368d9abb66d2601f0692645e7757affcb6ca769af6fb33ed8a8b109e0e5d8807 not found: ID does not exist" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.124787 4747 scope.go:117] "RemoveContainer" containerID="3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a" Nov 28 13:53:00 crc kubenswrapper[4747]: E1128 13:53:00.126509 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a\": container with ID starting with 3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a not found: ID does not exist" containerID="3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.126561 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a"} err="failed to get container status \"3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a\": rpc error: code = NotFound desc = could not find container \"3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a\": container with ID starting with 3ad619483dd85ebfa7e628081b6a07c77accab88227c7bb25fb15d901a378b9a not found: ID does not exist" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.126596 4747 scope.go:117] "RemoveContainer" containerID="ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b" Nov 28 13:53:00 crc kubenswrapper[4747]: E1128 13:53:00.128471 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b\": container with ID starting with ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b not found: ID does not exist" 
containerID="ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b" Nov 28 13:53:00 crc kubenswrapper[4747]: I1128 13:53:00.128528 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b"} err="failed to get container status \"ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b\": rpc error: code = NotFound desc = could not find container \"ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b\": container with ID starting with ec7fd723565ba00b7dde7cf3daab8aa2e80abdc9f8841694bb66d1458def889b not found: ID does not exist" Nov 28 13:53:01 crc kubenswrapper[4747]: I1128 13:53:01.649739 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" path="/var/lib/kubelet/pods/c8abe32c-68a5-4fb3-8b11-1d10b3dff664/volumes" Nov 28 13:53:12 crc kubenswrapper[4747]: I1128 13:53:12.652281 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pqb5v_534794d1-ed1e-4a3e-a094-ae6acb566bdc/kube-rbac-proxy/0.log" Nov 28 13:53:12 crc kubenswrapper[4747]: I1128 13:53:12.713136 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pqb5v_534794d1-ed1e-4a3e-a094-ae6acb566bdc/controller/0.log" Nov 28 13:53:12 crc kubenswrapper[4747]: I1128 13:53:12.899829 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:53:12 crc kubenswrapper[4747]: I1128 13:53:12.905273 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-r7gtv_2d74a4af-60f4-4a8e-9778-3be8fe163205/frr-k8s-webhook-server/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.119829 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.123424 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.126178 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.135104 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.314803 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.334666 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.364919 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.374586 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.487015 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.522464 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.537304 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.572778 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/controller/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.734491 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/frr-metrics/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.757135 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/kube-rbac-proxy/0.log" Nov 28 13:53:13 crc kubenswrapper[4747]: I1128 13:53:13.818161 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/kube-rbac-proxy-frr/0.log" Nov 28 13:53:14 crc kubenswrapper[4747]: I1128 13:53:14.009474 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/reloader/0.log" Nov 28 13:53:14 crc kubenswrapper[4747]: I1128 13:53:14.077945 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7456584b94-v95js_2f574e4d-330e-47db-85f9-48558244cccb/manager/0.log" Nov 28 13:53:14 crc kubenswrapper[4747]: I1128 13:53:14.105218 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/frr/0.log" Nov 28 13:53:14 crc kubenswrapper[4747]: I1128 13:53:14.216416 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58f86dcff-55hdw_57bc32d2-9f9f-4c31-9fdc-79027059691e/webhook-server/0.log" Nov 28 13:53:14 crc kubenswrapper[4747]: I1128 13:53:14.280475 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mt84d_108c08d8-4320-4227-ae03-933609bda4c0/kube-rbac-proxy/0.log" Nov 28 13:53:14 crc kubenswrapper[4747]: I1128 13:53:14.403591 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mt84d_108c08d8-4320-4227-ae03-933609bda4c0/speaker/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.103418 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/util/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.246083 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/util/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.267832 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/pull/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.282642 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/pull/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.427055 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/util/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.432492 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/pull/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.459078 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/extract/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.578288 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-utilities/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.733871 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-content/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.751259 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-content/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.765719 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-utilities/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.913346 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-content/0.log" Nov 28 13:53:38 crc kubenswrapper[4747]: I1128 13:53:38.933943 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-utilities/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.140342 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-utilities/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.240361 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/registry-server/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.359691 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-utilities/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.371867 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-content/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.397077 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-content/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.501475 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-utilities/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.501661 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-content/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.732534 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bb4cp_f8ce1410-e45d-4cb9-a8b3-de758929de4b/marketplace-operator/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.838675 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-utilities/0.log" Nov 28 13:53:39 crc kubenswrapper[4747]: I1128 13:53:39.960489 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/registry-server/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.004362 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-content/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.054943 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-utilities/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.082683 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-content/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.224813 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-utilities/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.232267 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-content/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.352220 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/registry-server/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.413675 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-utilities/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.582658 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-content/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.582885 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-content/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.591474 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-utilities/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.737724 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-content/0.log" Nov 28 13:53:40 crc kubenswrapper[4747]: I1128 13:53:40.761430 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-utilities/0.log" Nov 28 13:53:41 crc kubenswrapper[4747]: I1128 13:53:41.269313 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/registry-server/0.log" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.537455 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qmf"] Nov 28 13:54:30 crc kubenswrapper[4747]: E1128 13:54:30.538343 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="extract-content" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.538361 4747 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="extract-content" Nov 28 13:54:30 crc kubenswrapper[4747]: E1128 13:54:30.538374 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="extract-utilities" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.538382 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="extract-utilities" Nov 28 13:54:30 crc kubenswrapper[4747]: E1128 13:54:30.538394 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="registry-server" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.538402 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="registry-server" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.538545 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8abe32c-68a5-4fb3-8b11-1d10b3dff664" containerName="registry-server" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.539663 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.558504 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qmf"] Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.569078 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-catalog-content\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.569156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-utilities\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.569188 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75k8\" (UniqueName: \"kubernetes.io/projected/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-kube-api-access-l75k8\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.670106 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-utilities\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.670147 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-l75k8\" (UniqueName: \"kubernetes.io/projected/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-kube-api-access-l75k8\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.670198 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-catalog-content\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.670584 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-catalog-content\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.670815 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-utilities\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.692573 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75k8\" (UniqueName: \"kubernetes.io/projected/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-kube-api-access-l75k8\") pod \"redhat-marketplace-d2qmf\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:30 crc kubenswrapper[4747]: I1128 13:54:30.868338 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:31 crc kubenswrapper[4747]: I1128 13:54:31.316725 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qmf"] Nov 28 13:54:31 crc kubenswrapper[4747]: I1128 13:54:31.645437 4747 generic.go:334] "Generic (PLEG): container finished" podID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerID="14ce2d86723f4ca317968576a43305286a0c09d63eb3710bfeafbf322fdfc0c2" exitCode=0 Nov 28 13:54:31 crc kubenswrapper[4747]: I1128 13:54:31.653757 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qmf" event={"ID":"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5","Type":"ContainerDied","Data":"14ce2d86723f4ca317968576a43305286a0c09d63eb3710bfeafbf322fdfc0c2"} Nov 28 13:54:31 crc kubenswrapper[4747]: I1128 13:54:31.653849 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qmf" event={"ID":"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5","Type":"ContainerStarted","Data":"d2e96f301e0f7dbc5c772a3cba2f60ee076c579167bc7aab6c6308e7b2e41bcd"} Nov 28 13:54:34 crc kubenswrapper[4747]: I1128 13:54:34.677589 4747 generic.go:334] "Generic (PLEG): container finished" podID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerID="933b1d00f30c5053c9b84392d34e12a9e43e56f75ddbab623119ef0afdad9c45" exitCode=0 Nov 28 13:54:34 crc kubenswrapper[4747]: I1128 13:54:34.677755 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qmf" event={"ID":"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5","Type":"ContainerDied","Data":"933b1d00f30c5053c9b84392d34e12a9e43e56f75ddbab623119ef0afdad9c45"} Nov 28 13:54:35 crc kubenswrapper[4747]: I1128 13:54:35.698432 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qmf" 
event={"ID":"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5","Type":"ContainerStarted","Data":"da19d21c3d7e4b03f13e94729b4aa252849f54cf5243944b3ea11fc5edb0449e"} Nov 28 13:54:35 crc kubenswrapper[4747]: I1128 13:54:35.720298 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d2qmf" podStartSLOduration=3.2344973870000002 podStartE2EDuration="5.720276513s" podCreationTimestamp="2025-11-28 13:54:30 +0000 UTC" firstStartedPulling="2025-11-28 13:54:32.657770885 +0000 UTC m=+2125.320252655" lastFinishedPulling="2025-11-28 13:54:35.143550001 +0000 UTC m=+2127.806031781" observedRunningTime="2025-11-28 13:54:35.718532599 +0000 UTC m=+2128.381014349" watchObservedRunningTime="2025-11-28 13:54:35.720276513 +0000 UTC m=+2128.382758243" Nov 28 13:54:40 crc kubenswrapper[4747]: I1128 13:54:40.869002 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:40 crc kubenswrapper[4747]: I1128 13:54:40.870124 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:40 crc kubenswrapper[4747]: I1128 13:54:40.912267 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:41 crc kubenswrapper[4747]: I1128 13:54:41.799786 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:41 crc kubenswrapper[4747]: I1128 13:54:41.866001 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qmf"] Nov 28 13:54:43 crc kubenswrapper[4747]: I1128 13:54:43.746137 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d2qmf" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="registry-server" 
containerID="cri-o://da19d21c3d7e4b03f13e94729b4aa252849f54cf5243944b3ea11fc5edb0449e" gracePeriod=2 Nov 28 13:54:45 crc kubenswrapper[4747]: I1128 13:54:45.761853 4747 generic.go:334] "Generic (PLEG): container finished" podID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerID="157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354" exitCode=0 Nov 28 13:54:45 crc kubenswrapper[4747]: I1128 13:54:45.762289 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fj9t/must-gather-cqv78" event={"ID":"e92866e6-9889-472c-8721-5a7b89d1b1a8","Type":"ContainerDied","Data":"157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354"} Nov 28 13:54:45 crc kubenswrapper[4747]: I1128 13:54:45.762831 4747 scope.go:117] "RemoveContainer" containerID="157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354" Nov 28 13:54:45 crc kubenswrapper[4747]: I1128 13:54:45.768089 4747 generic.go:334] "Generic (PLEG): container finished" podID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerID="da19d21c3d7e4b03f13e94729b4aa252849f54cf5243944b3ea11fc5edb0449e" exitCode=0 Nov 28 13:54:45 crc kubenswrapper[4747]: I1128 13:54:45.768168 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qmf" event={"ID":"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5","Type":"ContainerDied","Data":"da19d21c3d7e4b03f13e94729b4aa252849f54cf5243944b3ea11fc5edb0449e"} Nov 28 13:54:45 crc kubenswrapper[4747]: I1128 13:54:45.924343 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.024108 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-catalog-content\") pod \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.024172 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-utilities\") pod \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.024324 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l75k8\" (UniqueName: \"kubernetes.io/projected/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-kube-api-access-l75k8\") pod \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\" (UID: \"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5\") " Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.026568 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-utilities" (OuterVolumeSpecName: "utilities") pod "b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" (UID: "b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.036369 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-kube-api-access-l75k8" (OuterVolumeSpecName: "kube-api-access-l75k8") pod "b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" (UID: "b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5"). InnerVolumeSpecName "kube-api-access-l75k8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.061963 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" (UID: "b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.125937 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l75k8\" (UniqueName: \"kubernetes.io/projected/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-kube-api-access-l75k8\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.125984 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.126000 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.164872 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8fj9t_must-gather-cqv78_e92866e6-9889-472c-8721-5a7b89d1b1a8/gather/0.log" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.778097 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qmf" event={"ID":"b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5","Type":"ContainerDied","Data":"d2e96f301e0f7dbc5c772a3cba2f60ee076c579167bc7aab6c6308e7b2e41bcd"} Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.778158 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qmf" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.778166 4747 scope.go:117] "RemoveContainer" containerID="da19d21c3d7e4b03f13e94729b4aa252849f54cf5243944b3ea11fc5edb0449e" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.802035 4747 scope.go:117] "RemoveContainer" containerID="933b1d00f30c5053c9b84392d34e12a9e43e56f75ddbab623119ef0afdad9c45" Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.816412 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qmf"] Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.821718 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qmf"] Nov 28 13:54:46 crc kubenswrapper[4747]: I1128 13:54:46.841840 4747 scope.go:117] "RemoveContainer" containerID="14ce2d86723f4ca317968576a43305286a0c09d63eb3710bfeafbf322fdfc0c2" Nov 28 13:54:47 crc kubenswrapper[4747]: I1128 13:54:47.632697 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:54:47 crc kubenswrapper[4747]: I1128 13:54:47.633133 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:54:47 crc kubenswrapper[4747]: I1128 13:54:47.649441 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" path="/var/lib/kubelet/pods/b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5/volumes" Nov 28 13:54:52 crc kubenswrapper[4747]: 
I1128 13:54:52.859371 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8fj9t/must-gather-cqv78"] Nov 28 13:54:52 crc kubenswrapper[4747]: I1128 13:54:52.860024 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8fj9t/must-gather-cqv78" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerName="copy" containerID="cri-o://7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e" gracePeriod=2 Nov 28 13:54:52 crc kubenswrapper[4747]: I1128 13:54:52.866367 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8fj9t/must-gather-cqv78"] Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.203508 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8fj9t_must-gather-cqv78_e92866e6-9889-472c-8721-5a7b89d1b1a8/copy/0.log" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.204611 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.320778 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n777w\" (UniqueName: \"kubernetes.io/projected/e92866e6-9889-472c-8721-5a7b89d1b1a8-kube-api-access-n777w\") pod \"e92866e6-9889-472c-8721-5a7b89d1b1a8\" (UID: \"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.320863 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e92866e6-9889-472c-8721-5a7b89d1b1a8-must-gather-output\") pod \"e92866e6-9889-472c-8721-5a7b89d1b1a8\" (UID: \"e92866e6-9889-472c-8721-5a7b89d1b1a8\") " Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.331076 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e92866e6-9889-472c-8721-5a7b89d1b1a8-kube-api-access-n777w" (OuterVolumeSpecName: "kube-api-access-n777w") pod "e92866e6-9889-472c-8721-5a7b89d1b1a8" (UID: "e92866e6-9889-472c-8721-5a7b89d1b1a8"). InnerVolumeSpecName "kube-api-access-n777w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.387279 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e92866e6-9889-472c-8721-5a7b89d1b1a8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e92866e6-9889-472c-8721-5a7b89d1b1a8" (UID: "e92866e6-9889-472c-8721-5a7b89d1b1a8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.422213 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n777w\" (UniqueName: \"kubernetes.io/projected/e92866e6-9889-472c-8721-5a7b89d1b1a8-kube-api-access-n777w\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.422245 4747 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e92866e6-9889-472c-8721-5a7b89d1b1a8-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.647073 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" path="/var/lib/kubelet/pods/e92866e6-9889-472c-8721-5a7b89d1b1a8/volumes" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.824118 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8fj9t_must-gather-cqv78_e92866e6-9889-472c-8721-5a7b89d1b1a8/copy/0.log" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.825598 4747 generic.go:334] "Generic (PLEG): container finished" podID="e92866e6-9889-472c-8721-5a7b89d1b1a8" 
containerID="7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e" exitCode=143 Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.825687 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fj9t/must-gather-cqv78" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.825717 4747 scope.go:117] "RemoveContainer" containerID="7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.843136 4747 scope.go:117] "RemoveContainer" containerID="157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.876420 4747 scope.go:117] "RemoveContainer" containerID="7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e" Nov 28 13:54:53 crc kubenswrapper[4747]: E1128 13:54:53.876802 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e\": container with ID starting with 7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e not found: ID does not exist" containerID="7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.876837 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e"} err="failed to get container status \"7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e\": rpc error: code = NotFound desc = could not find container \"7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e\": container with ID starting with 7ad4fb6c89bb217cdf211bcd5843862001aa8c396b783f2e3730e75ddafd883e not found: ID does not exist" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.876863 4747 scope.go:117] "RemoveContainer" 
containerID="157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354" Nov 28 13:54:53 crc kubenswrapper[4747]: E1128 13:54:53.877368 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354\": container with ID starting with 157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354 not found: ID does not exist" containerID="157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354" Nov 28 13:54:53 crc kubenswrapper[4747]: I1128 13:54:53.877437 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354"} err="failed to get container status \"157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354\": rpc error: code = NotFound desc = could not find container \"157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354\": container with ID starting with 157ba1433729aeff88d41394c453294cbdd96608e3c27a21c1b095475a8f9354 not found: ID does not exist" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.021475 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b6wk4"] Nov 28 13:54:59 crc kubenswrapper[4747]: E1128 13:54:59.021915 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerName="gather" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.021926 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerName="gather" Nov 28 13:54:59 crc kubenswrapper[4747]: E1128 13:54:59.021940 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="extract-content" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.021946 4747 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="extract-content" Nov 28 13:54:59 crc kubenswrapper[4747]: E1128 13:54:59.021956 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerName="copy" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.021963 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerName="copy" Nov 28 13:54:59 crc kubenswrapper[4747]: E1128 13:54:59.021976 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="registry-server" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.021982 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="registry-server" Nov 28 13:54:59 crc kubenswrapper[4747]: E1128 13:54:59.021990 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="extract-utilities" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.021996 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="extract-utilities" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.022091 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerName="copy" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.022106 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92866e6-9889-472c-8721-5a7b89d1b1a8" containerName="gather" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.022113 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9946fbf-6da5-4dad-9ef5-4e3f0f3e07e5" containerName="registry-server" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.022814 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.045375 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6wk4"] Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.098770 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lf5q\" (UniqueName: \"kubernetes.io/projected/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-kube-api-access-2lf5q\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.098837 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-catalog-content\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.098864 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-utilities\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.200102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-catalog-content\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.200166 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-utilities\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.200260 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lf5q\" (UniqueName: \"kubernetes.io/projected/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-kube-api-access-2lf5q\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.201097 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-catalog-content\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.201340 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-utilities\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.224664 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lf5q\" (UniqueName: \"kubernetes.io/projected/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-kube-api-access-2lf5q\") pod \"certified-operators-b6wk4\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.359024 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.559267 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6wk4"] Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.886146 4747 generic.go:334] "Generic (PLEG): container finished" podID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerID="292283fd017b552779d22ab73b5c6398cf9ea7a55146f599c9837bf6c772e0b2" exitCode=0 Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.886193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6wk4" event={"ID":"0daee8be-1ee1-4254-8d9b-bdc30dfad82f","Type":"ContainerDied","Data":"292283fd017b552779d22ab73b5c6398cf9ea7a55146f599c9837bf6c772e0b2"} Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.886241 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6wk4" event={"ID":"0daee8be-1ee1-4254-8d9b-bdc30dfad82f","Type":"ContainerStarted","Data":"ffa68cb766b5d05ed598183c5e78fb58763e0163f14efb1c9bcf50d929ebae51"} Nov 28 13:54:59 crc kubenswrapper[4747]: I1128 13:54:59.888023 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:55:01 crc kubenswrapper[4747]: I1128 13:55:01.907158 4747 generic.go:334] "Generic (PLEG): container finished" podID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerID="8835e76586cad11e16c8ce816751c19d15f11df40eb5d9d26aecf88002be474a" exitCode=0 Nov 28 13:55:01 crc kubenswrapper[4747]: I1128 13:55:01.907296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6wk4" event={"ID":"0daee8be-1ee1-4254-8d9b-bdc30dfad82f","Type":"ContainerDied","Data":"8835e76586cad11e16c8ce816751c19d15f11df40eb5d9d26aecf88002be474a"} Nov 28 13:55:02 crc kubenswrapper[4747]: I1128 13:55:02.916570 4747 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-b6wk4" event={"ID":"0daee8be-1ee1-4254-8d9b-bdc30dfad82f","Type":"ContainerStarted","Data":"1a8a1a50e60a843d5186641d379b6c83e2c2de0e39d224e9681473c316d72d25"} Nov 28 13:55:02 crc kubenswrapper[4747]: I1128 13:55:02.935484 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b6wk4" podStartSLOduration=1.464006969 podStartE2EDuration="3.935459955s" podCreationTimestamp="2025-11-28 13:54:59 +0000 UTC" firstStartedPulling="2025-11-28 13:54:59.887669113 +0000 UTC m=+2152.550150843" lastFinishedPulling="2025-11-28 13:55:02.359122099 +0000 UTC m=+2155.021603829" observedRunningTime="2025-11-28 13:55:02.934379478 +0000 UTC m=+2155.596861248" watchObservedRunningTime="2025-11-28 13:55:02.935459955 +0000 UTC m=+2155.597941705" Nov 28 13:55:09 crc kubenswrapper[4747]: I1128 13:55:09.359501 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:55:09 crc kubenswrapper[4747]: I1128 13:55:09.359862 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:55:09 crc kubenswrapper[4747]: I1128 13:55:09.402178 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:55:09 crc kubenswrapper[4747]: I1128 13:55:09.994642 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:55:10 crc kubenswrapper[4747]: I1128 13:55:10.037303 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6wk4"] Nov 28 13:55:11 crc kubenswrapper[4747]: I1128 13:55:11.974071 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b6wk4" 
podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="registry-server" containerID="cri-o://1a8a1a50e60a843d5186641d379b6c83e2c2de0e39d224e9681473c316d72d25" gracePeriod=2 Nov 28 13:55:13 crc kubenswrapper[4747]: I1128 13:55:13.993177 4747 generic.go:334] "Generic (PLEG): container finished" podID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerID="1a8a1a50e60a843d5186641d379b6c83e2c2de0e39d224e9681473c316d72d25" exitCode=0 Nov 28 13:55:13 crc kubenswrapper[4747]: I1128 13:55:13.993240 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6wk4" event={"ID":"0daee8be-1ee1-4254-8d9b-bdc30dfad82f","Type":"ContainerDied","Data":"1a8a1a50e60a843d5186641d379b6c83e2c2de0e39d224e9681473c316d72d25"} Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.442810 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.621570 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lf5q\" (UniqueName: \"kubernetes.io/projected/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-kube-api-access-2lf5q\") pod \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.621723 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-utilities\") pod \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\" (UID: \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.621769 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-catalog-content\") pod \"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\" (UID: 
\"0daee8be-1ee1-4254-8d9b-bdc30dfad82f\") " Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.623693 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-utilities" (OuterVolumeSpecName: "utilities") pod "0daee8be-1ee1-4254-8d9b-bdc30dfad82f" (UID: "0daee8be-1ee1-4254-8d9b-bdc30dfad82f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.628152 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-kube-api-access-2lf5q" (OuterVolumeSpecName: "kube-api-access-2lf5q") pod "0daee8be-1ee1-4254-8d9b-bdc30dfad82f" (UID: "0daee8be-1ee1-4254-8d9b-bdc30dfad82f"). InnerVolumeSpecName "kube-api-access-2lf5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.685482 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0daee8be-1ee1-4254-8d9b-bdc30dfad82f" (UID: "0daee8be-1ee1-4254-8d9b-bdc30dfad82f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.723396 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.723438 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:55:14 crc kubenswrapper[4747]: I1128 13:55:14.723454 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lf5q\" (UniqueName: \"kubernetes.io/projected/0daee8be-1ee1-4254-8d9b-bdc30dfad82f-kube-api-access-2lf5q\") on node \"crc\" DevicePath \"\"" Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.002327 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6wk4" event={"ID":"0daee8be-1ee1-4254-8d9b-bdc30dfad82f","Type":"ContainerDied","Data":"ffa68cb766b5d05ed598183c5e78fb58763e0163f14efb1c9bcf50d929ebae51"} Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.002381 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6wk4" Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.002394 4747 scope.go:117] "RemoveContainer" containerID="1a8a1a50e60a843d5186641d379b6c83e2c2de0e39d224e9681473c316d72d25" Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.020761 4747 scope.go:117] "RemoveContainer" containerID="8835e76586cad11e16c8ce816751c19d15f11df40eb5d9d26aecf88002be474a" Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.039178 4747 scope.go:117] "RemoveContainer" containerID="292283fd017b552779d22ab73b5c6398cf9ea7a55146f599c9837bf6c772e0b2" Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.041833 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6wk4"] Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.045346 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b6wk4"] Nov 28 13:55:15 crc kubenswrapper[4747]: I1128 13:55:15.655608 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" path="/var/lib/kubelet/pods/0daee8be-1ee1-4254-8d9b-bdc30dfad82f/volumes" Nov 28 13:55:17 crc kubenswrapper[4747]: I1128 13:55:17.633045 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:55:17 crc kubenswrapper[4747]: I1128 13:55:17.633139 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:55:47 crc kubenswrapper[4747]: 
I1128 13:55:47.633380 4747 patch_prober.go:28] interesting pod/machine-config-daemon-zbzpq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:55:47 crc kubenswrapper[4747]: I1128 13:55:47.634173 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:55:47 crc kubenswrapper[4747]: I1128 13:55:47.634276 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" Nov 28 13:55:47 crc kubenswrapper[4747]: I1128 13:55:47.635112 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b"} pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:55:47 crc kubenswrapper[4747]: I1128 13:55:47.635238 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerName="machine-config-daemon" containerID="cri-o://8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" gracePeriod=600 Nov 28 13:55:47 crc kubenswrapper[4747]: E1128 13:55:47.780003 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:55:48 crc kubenswrapper[4747]: I1128 13:55:48.247298 4747 generic.go:334] "Generic (PLEG): container finished" podID="bc55136c-24a8-4913-b8b9-afe93e54fd83" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" exitCode=0 Nov 28 13:55:48 crc kubenswrapper[4747]: I1128 13:55:48.247352 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerDied","Data":"8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b"} Nov 28 13:55:48 crc kubenswrapper[4747]: I1128 13:55:48.247406 4747 scope.go:117] "RemoveContainer" containerID="067619c9f8cf3335b45d110ae252404003be27cf7880dcac06aace80a9501192" Nov 28 13:55:48 crc kubenswrapper[4747]: I1128 13:55:48.247963 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:55:48 crc kubenswrapper[4747]: E1128 13:55:48.248295 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:56:00 crc kubenswrapper[4747]: I1128 13:56:00.642005 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:56:00 crc kubenswrapper[4747]: E1128 13:56:00.643115 4747 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:56:12 crc kubenswrapper[4747]: I1128 13:56:12.642745 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:56:12 crc kubenswrapper[4747]: E1128 13:56:12.643783 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:56:26 crc kubenswrapper[4747]: I1128 13:56:26.641608 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:56:26 crc kubenswrapper[4747]: E1128 13:56:26.642548 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:56:37 crc kubenswrapper[4747]: I1128 13:56:37.645689 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:56:37 crc kubenswrapper[4747]: E1128 13:56:37.647726 4747 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:56:52 crc kubenswrapper[4747]: I1128 13:56:52.640949 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:56:52 crc kubenswrapper[4747]: E1128 13:56:52.641733 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:57:07 crc kubenswrapper[4747]: I1128 13:57:07.646463 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:57:07 crc kubenswrapper[4747]: E1128 13:57:07.647285 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.197098 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-88qbf/must-gather-4qccm"] Nov 28 13:57:11 crc kubenswrapper[4747]: E1128 13:57:11.197923 4747 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="extract-utilities" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.197945 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="extract-utilities" Nov 28 13:57:11 crc kubenswrapper[4747]: E1128 13:57:11.197964 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="extract-content" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.197976 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="extract-content" Nov 28 13:57:11 crc kubenswrapper[4747]: E1128 13:57:11.197996 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="registry-server" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.198008 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="registry-server" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.198185 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0daee8be-1ee1-4254-8d9b-bdc30dfad82f" containerName="registry-server" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.199159 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.204288 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-88qbf"/"openshift-service-ca.crt" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.204358 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-88qbf"/"default-dockercfg-c2sgw" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.204522 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-88qbf"/"kube-root-ca.crt" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.210872 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-88qbf/must-gather-4qccm"] Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.290435 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d87f4fa3-f348-4684-a5d8-90b563b3b771-must-gather-output\") pod \"must-gather-4qccm\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.290525 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrf5w\" (UniqueName: \"kubernetes.io/projected/d87f4fa3-f348-4684-a5d8-90b563b3b771-kube-api-access-lrf5w\") pod \"must-gather-4qccm\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.391283 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf5w\" (UniqueName: \"kubernetes.io/projected/d87f4fa3-f348-4684-a5d8-90b563b3b771-kube-api-access-lrf5w\") pod \"must-gather-4qccm\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " 
pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.391372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d87f4fa3-f348-4684-a5d8-90b563b3b771-must-gather-output\") pod \"must-gather-4qccm\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.391882 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d87f4fa3-f348-4684-a5d8-90b563b3b771-must-gather-output\") pod \"must-gather-4qccm\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.421821 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf5w\" (UniqueName: \"kubernetes.io/projected/d87f4fa3-f348-4684-a5d8-90b563b3b771-kube-api-access-lrf5w\") pod \"must-gather-4qccm\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.519280 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:57:11 crc kubenswrapper[4747]: I1128 13:57:11.931900 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-88qbf/must-gather-4qccm"] Nov 28 13:57:12 crc kubenswrapper[4747]: I1128 13:57:12.801350 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-88qbf/must-gather-4qccm" event={"ID":"d87f4fa3-f348-4684-a5d8-90b563b3b771","Type":"ContainerStarted","Data":"b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6"} Nov 28 13:57:12 crc kubenswrapper[4747]: I1128 13:57:12.801673 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-88qbf/must-gather-4qccm" event={"ID":"d87f4fa3-f348-4684-a5d8-90b563b3b771","Type":"ContainerStarted","Data":"baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1"} Nov 28 13:57:12 crc kubenswrapper[4747]: I1128 13:57:12.801686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-88qbf/must-gather-4qccm" event={"ID":"d87f4fa3-f348-4684-a5d8-90b563b3b771","Type":"ContainerStarted","Data":"2998f8b88598e651e11090ee9b47248ad9503c83ba14b2e2019f5ddfbdd1c70b"} Nov 28 13:57:12 crc kubenswrapper[4747]: I1128 13:57:12.821072 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-88qbf/must-gather-4qccm" podStartSLOduration=1.8210378600000001 podStartE2EDuration="1.82103786s" podCreationTimestamp="2025-11-28 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:57:12.815891409 +0000 UTC m=+2285.478373159" watchObservedRunningTime="2025-11-28 13:57:12.82103786 +0000 UTC m=+2285.483519630" Nov 28 13:57:21 crc kubenswrapper[4747]: I1128 13:57:21.641868 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:57:21 crc 
kubenswrapper[4747]: E1128 13:57:21.644072 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:57:33 crc kubenswrapper[4747]: I1128 13:57:33.641454 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:57:33 crc kubenswrapper[4747]: E1128 13:57:33.642447 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:57:47 crc kubenswrapper[4747]: I1128 13:57:47.644872 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:57:47 crc kubenswrapper[4747]: E1128 13:57:47.645780 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:57:54 crc kubenswrapper[4747]: I1128 13:57:54.330491 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dw4lg_1fd74f5c-e178-4dd8-b8a9-9f58ca6e26ed/control-plane-machine-set-operator/0.log" Nov 28 13:57:54 crc kubenswrapper[4747]: I1128 13:57:54.541544 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hlq5_8a7ae477-ca56-4362-a334-a2915d71fdf0/kube-rbac-proxy/0.log" Nov 28 13:57:54 crc kubenswrapper[4747]: I1128 13:57:54.545277 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hlq5_8a7ae477-ca56-4362-a334-a2915d71fdf0/machine-api-operator/0.log" Nov 28 13:57:59 crc kubenswrapper[4747]: I1128 13:57:59.641871 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:57:59 crc kubenswrapper[4747]: E1128 13:57:59.642455 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.341612 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pqb5v_534794d1-ed1e-4a3e-a094-ae6acb566bdc/kube-rbac-proxy/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.390866 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pqb5v_534794d1-ed1e-4a3e-a094-ae6acb566bdc/controller/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.514621 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-r7gtv_2d74a4af-60f4-4a8e-9778-3be8fe163205/frr-k8s-webhook-server/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.579600 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.725264 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.743684 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.758998 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.767222 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.906021 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.934105 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.943060 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:58:09 crc kubenswrapper[4747]: I1128 13:58:09.954976 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.106376 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-reloader/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.126044 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-frr-files/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.132780 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/cp-metrics/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.135684 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/controller/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.290503 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/kube-rbac-proxy/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.318280 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/frr-metrics/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.334718 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/kube-rbac-proxy-frr/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.490561 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/reloader/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.555053 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7456584b94-v95js_2f574e4d-330e-47db-85f9-48558244cccb/manager/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.746894 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58f86dcff-55hdw_57bc32d2-9f9f-4c31-9fdc-79027059691e/webhook-server/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.773725 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xvnqh_1ad103ad-636e-440b-924d-7b59aa875aa4/frr/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.856365 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mt84d_108c08d8-4320-4227-ae03-933609bda4c0/kube-rbac-proxy/0.log" Nov 28 13:58:10 crc kubenswrapper[4747]: I1128 13:58:10.977056 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mt84d_108c08d8-4320-4227-ae03-933609bda4c0/speaker/0.log" Nov 28 13:58:14 crc kubenswrapper[4747]: I1128 13:58:14.641837 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:58:14 crc kubenswrapper[4747]: E1128 13:58:14.642406 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:58:15 crc kubenswrapper[4747]: I1128 13:58:15.048237 4747 scope.go:117] "RemoveContainer" containerID="07dc9cd2c22b1b6782dde34aa0b21c254e35cabadc778a9fd8e90792c4d68599" Nov 28 13:58:26 crc kubenswrapper[4747]: I1128 13:58:26.641478 4747 scope.go:117] "RemoveContainer" 
containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:58:26 crc kubenswrapper[4747]: E1128 13:58:26.642530 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.120565 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/util/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.302003 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/pull/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.330145 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/util/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.346032 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/pull/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.496475 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/util/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.528572 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/extract/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.528816 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832pmsd_15555506-9bd3-401c-b26e-52cd441c0663/pull/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.657728 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-utilities/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.832895 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-utilities/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.850097 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-content/0.log" Nov 28 13:58:35 crc kubenswrapper[4747]: I1128 13:58:35.879569 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-content/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.038073 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-utilities/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.047134 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/extract-content/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.220052 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-utilities/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.377996 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-content/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.394667 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-content/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.415432 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-utilities/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.423759 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nz5c_b47935e1-b026-44cb-8e7c-518913365e82/registry-server/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.623370 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-content/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.624025 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/extract-utilities/0.log" Nov 28 13:58:36 crc kubenswrapper[4747]: I1128 13:58:36.826784 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bb4cp_f8ce1410-e45d-4cb9-a8b3-de758929de4b/marketplace-operator/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.003668 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-utilities/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.012170 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9vf94_10510a33-c8cf-4796-ac7c-26095e641b73/registry-server/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.120042 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-utilities/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.126159 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-content/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.166269 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-content/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.322928 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-utilities/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.358887 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/extract-content/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.457398 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-62j9r_05d43de7-aac3-4012-b0d9-163896d07ffc/registry-server/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.546862 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-utilities/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.739982 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-content/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.740744 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-content/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.744104 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-utilities/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.937506 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-content/0.log" Nov 28 13:58:37 crc kubenswrapper[4747]: I1128 13:58:37.947847 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/extract-utilities/0.log" Nov 28 13:58:38 crc kubenswrapper[4747]: I1128 13:58:38.340028 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q49jb_d42b6c41-5b4a-4044-a66b-80fcdb3e9574/registry-server/0.log" Nov 28 13:58:40 crc kubenswrapper[4747]: I1128 13:58:40.641610 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:58:40 crc kubenswrapper[4747]: E1128 13:58:40.641922 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:58:53 crc kubenswrapper[4747]: I1128 13:58:53.642177 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:58:53 crc kubenswrapper[4747]: E1128 13:58:53.643036 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:59:05 crc kubenswrapper[4747]: I1128 13:59:05.646347 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:59:05 crc kubenswrapper[4747]: E1128 13:59:05.647039 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:59:17 crc kubenswrapper[4747]: I1128 13:59:17.648457 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:59:17 crc kubenswrapper[4747]: E1128 13:59:17.649528 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:59:30 crc kubenswrapper[4747]: I1128 13:59:30.641064 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:59:30 crc kubenswrapper[4747]: E1128 13:59:30.642068 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:59:41 crc kubenswrapper[4747]: I1128 13:59:41.650921 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:59:41 crc kubenswrapper[4747]: E1128 13:59:41.653531 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:59:45 crc kubenswrapper[4747]: I1128 13:59:45.729029 4747 generic.go:334] "Generic (PLEG): container finished" podID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerID="baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1" exitCode=0 Nov 28 13:59:45 crc kubenswrapper[4747]: I1128 13:59:45.729139 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-88qbf/must-gather-4qccm" event={"ID":"d87f4fa3-f348-4684-a5d8-90b563b3b771","Type":"ContainerDied","Data":"baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1"} Nov 28 13:59:45 crc kubenswrapper[4747]: I1128 13:59:45.730062 4747 scope.go:117] "RemoveContainer" containerID="baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1" Nov 28 13:59:45 crc kubenswrapper[4747]: I1128 13:59:45.868395 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-88qbf_must-gather-4qccm_d87f4fa3-f348-4684-a5d8-90b563b3b771/gather/0.log" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.215543 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-88qbf/must-gather-4qccm"] Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.216798 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-88qbf/must-gather-4qccm" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerName="copy" containerID="cri-o://b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6" gracePeriod=2 Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.219423 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-88qbf/must-gather-4qccm"] Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.579122 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-88qbf_must-gather-4qccm_d87f4fa3-f348-4684-a5d8-90b563b3b771/copy/0.log" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.579805 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.734397 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d87f4fa3-f348-4684-a5d8-90b563b3b771-must-gather-output\") pod \"d87f4fa3-f348-4684-a5d8-90b563b3b771\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.734570 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrf5w\" (UniqueName: \"kubernetes.io/projected/d87f4fa3-f348-4684-a5d8-90b563b3b771-kube-api-access-lrf5w\") pod \"d87f4fa3-f348-4684-a5d8-90b563b3b771\" (UID: \"d87f4fa3-f348-4684-a5d8-90b563b3b771\") " Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.741331 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d87f4fa3-f348-4684-a5d8-90b563b3b771-kube-api-access-lrf5w" (OuterVolumeSpecName: "kube-api-access-lrf5w") pod "d87f4fa3-f348-4684-a5d8-90b563b3b771" (UID: "d87f4fa3-f348-4684-a5d8-90b563b3b771"). InnerVolumeSpecName "kube-api-access-lrf5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.781911 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-88qbf_must-gather-4qccm_d87f4fa3-f348-4684-a5d8-90b563b3b771/copy/0.log" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.782426 4747 generic.go:334] "Generic (PLEG): container finished" podID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerID="b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6" exitCode=143 Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.782482 4747 scope.go:117] "RemoveContainer" containerID="b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.782587 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-88qbf/must-gather-4qccm" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.801630 4747 scope.go:117] "RemoveContainer" containerID="baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.806400 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d87f4fa3-f348-4684-a5d8-90b563b3b771-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d87f4fa3-f348-4684-a5d8-90b563b3b771" (UID: "d87f4fa3-f348-4684-a5d8-90b563b3b771"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.839727 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrf5w\" (UniqueName: \"kubernetes.io/projected/d87f4fa3-f348-4684-a5d8-90b563b3b771-kube-api-access-lrf5w\") on node \"crc\" DevicePath \"\"" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.839768 4747 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d87f4fa3-f348-4684-a5d8-90b563b3b771-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.845656 4747 scope.go:117] "RemoveContainer" containerID="b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6" Nov 28 13:59:54 crc kubenswrapper[4747]: E1128 13:59:54.846891 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6\": container with ID starting with b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6 not found: ID does not exist" containerID="b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.846923 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6"} err="failed to get container status \"b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6\": rpc error: code = NotFound desc = could not find container \"b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6\": container with ID starting with b5d4f64aa688fb7376886cbc0477dab4d20ee5b51a41c1c81522f3446e2d34b6 not found: ID does not exist" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.846943 4747 scope.go:117] "RemoveContainer" 
containerID="baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1" Nov 28 13:59:54 crc kubenswrapper[4747]: E1128 13:59:54.847379 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1\": container with ID starting with baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1 not found: ID does not exist" containerID="baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1" Nov 28 13:59:54 crc kubenswrapper[4747]: I1128 13:59:54.847412 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1"} err="failed to get container status \"baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1\": rpc error: code = NotFound desc = could not find container \"baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1\": container with ID starting with baad7bd06001a0c56718d2464fe1fd8e8bfc9e1ae9325a470f8a168ef50324f1 not found: ID does not exist" Nov 28 13:59:55 crc kubenswrapper[4747]: E1128 13:59:55.167626 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd87f4fa3_f348_4684_a5d8_90b563b3b771.slice/crio-2998f8b88598e651e11090ee9b47248ad9503c83ba14b2e2019f5ddfbdd1c70b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd87f4fa3_f348_4684_a5d8_90b563b3b771.slice\": RecentStats: unable to find data in memory cache]" Nov 28 13:59:55 crc kubenswrapper[4747]: I1128 13:59:55.648519 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" path="/var/lib/kubelet/pods/d87f4fa3-f348-4684-a5d8-90b563b3b771/volumes" Nov 28 13:59:56 crc 
kubenswrapper[4747]: I1128 13:59:56.641789 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b" Nov 28 13:59:56 crc kubenswrapper[4747]: E1128 13:59:56.642027 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.562878 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ps5d4"] Nov 28 13:59:57 crc kubenswrapper[4747]: E1128 13:59:57.564093 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerName="copy" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.564113 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerName="copy" Nov 28 13:59:57 crc kubenswrapper[4747]: E1128 13:59:57.564142 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerName="gather" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.564150 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerName="gather" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.564301 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerName="copy" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.564325 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d87f4fa3-f348-4684-a5d8-90b563b3b771" containerName="gather" Nov 28 13:59:57 crc 
kubenswrapper[4747]: I1128 13:59:57.565337 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.577815 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps5d4"] Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.677221 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-catalog-content\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.677310 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7z7\" (UniqueName: \"kubernetes.io/projected/ceed3fc3-3432-42fa-a724-e1f266279d04-kube-api-access-pq7z7\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.677391 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-utilities\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.778825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-utilities\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc 
kubenswrapper[4747]: I1128 13:59:57.778876 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-catalog-content\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.778917 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7z7\" (UniqueName: \"kubernetes.io/projected/ceed3fc3-3432-42fa-a724-e1f266279d04-kube-api-access-pq7z7\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.779581 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-utilities\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.779606 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-catalog-content\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 13:59:57.799192 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7z7\" (UniqueName: \"kubernetes.io/projected/ceed3fc3-3432-42fa-a724-e1f266279d04-kube-api-access-pq7z7\") pod \"community-operators-ps5d4\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") " pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:57 crc kubenswrapper[4747]: I1128 
13:59:57.890119 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps5d4" Nov 28 13:59:58 crc kubenswrapper[4747]: I1128 13:59:58.201187 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps5d4"] Nov 28 13:59:58 crc kubenswrapper[4747]: I1128 13:59:58.807583 4747 generic.go:334] "Generic (PLEG): container finished" podID="ceed3fc3-3432-42fa-a724-e1f266279d04" containerID="9d801e3448c1f9020e8a054c7fd8614cf992208a033953c55d3cac5e8a9c8a82" exitCode=0 Nov 28 13:59:58 crc kubenswrapper[4747]: I1128 13:59:58.807691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps5d4" event={"ID":"ceed3fc3-3432-42fa-a724-e1f266279d04","Type":"ContainerDied","Data":"9d801e3448c1f9020e8a054c7fd8614cf992208a033953c55d3cac5e8a9c8a82"} Nov 28 13:59:58 crc kubenswrapper[4747]: I1128 13:59:58.807909 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps5d4" event={"ID":"ceed3fc3-3432-42fa-a724-e1f266279d04","Type":"ContainerStarted","Data":"acbd4c9e632fba4880133048dfd2aa001b4b554ad4c8951522fc973dc916d422"} Nov 28 13:59:59 crc kubenswrapper[4747]: I1128 13:59:59.816238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps5d4" event={"ID":"ceed3fc3-3432-42fa-a724-e1f266279d04","Type":"ContainerStarted","Data":"60d4b8bef9afd85f5b380387774843b4536c44658c72d7d09088cb53dc81f63b"} Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.136193 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"] Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.136942 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.140367 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.142518 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.145194 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"]
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.215661 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmdnp\" (UniqueName: \"kubernetes.io/projected/489400fa-6bf9-4175-8d9e-c744efbb3019-kube-api-access-gmdnp\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.215929 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/489400fa-6bf9-4175-8d9e-c744efbb3019-config-volume\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.215999 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/489400fa-6bf9-4175-8d9e-c744efbb3019-secret-volume\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.320632 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/489400fa-6bf9-4175-8d9e-c744efbb3019-config-volume\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.320773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/489400fa-6bf9-4175-8d9e-c744efbb3019-secret-volume\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.320855 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmdnp\" (UniqueName: \"kubernetes.io/projected/489400fa-6bf9-4175-8d9e-c744efbb3019-kube-api-access-gmdnp\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.328527 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/489400fa-6bf9-4175-8d9e-c744efbb3019-secret-volume\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.332025 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/489400fa-6bf9-4175-8d9e-c744efbb3019-config-volume\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.344031 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmdnp\" (UniqueName: \"kubernetes.io/projected/489400fa-6bf9-4175-8d9e-c744efbb3019-kube-api-access-gmdnp\") pod \"collect-profiles-29405640-9qdv4\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.451742 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.703002 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"]
Nov 28 14:00:00 crc kubenswrapper[4747]: W1128 14:00:00.708614 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod489400fa_6bf9_4175_8d9e_c744efbb3019.slice/crio-5f04c73009db5562973639932e7661a39a0b2098ba30607fc391b1f0958016f4 WatchSource:0}: Error finding container 5f04c73009db5562973639932e7661a39a0b2098ba30607fc391b1f0958016f4: Status 404 returned error can't find the container with id 5f04c73009db5562973639932e7661a39a0b2098ba30607fc391b1f0958016f4
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.829908 4747 generic.go:334] "Generic (PLEG): container finished" podID="ceed3fc3-3432-42fa-a724-e1f266279d04" containerID="60d4b8bef9afd85f5b380387774843b4536c44658c72d7d09088cb53dc81f63b" exitCode=0
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.831452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps5d4" event={"ID":"ceed3fc3-3432-42fa-a724-e1f266279d04","Type":"ContainerDied","Data":"60d4b8bef9afd85f5b380387774843b4536c44658c72d7d09088cb53dc81f63b"}
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.832584 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 28 14:00:00 crc kubenswrapper[4747]: I1128 14:00:00.834289 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4" event={"ID":"489400fa-6bf9-4175-8d9e-c744efbb3019","Type":"ContainerStarted","Data":"5f04c73009db5562973639932e7661a39a0b2098ba30607fc391b1f0958016f4"}
Nov 28 14:00:01 crc kubenswrapper[4747]: I1128 14:00:01.844692 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps5d4" event={"ID":"ceed3fc3-3432-42fa-a724-e1f266279d04","Type":"ContainerStarted","Data":"18bc484d7338fbb711f65bf14492e5dcd497e84b49422494a24ce1a7b330350f"}
Nov 28 14:00:01 crc kubenswrapper[4747]: I1128 14:00:01.846866 4747 generic.go:334] "Generic (PLEG): container finished" podID="489400fa-6bf9-4175-8d9e-c744efbb3019" containerID="efde510577b3d89e6886cbc0ab196a994382cc50ddc215de55accee319413245" exitCode=0
Nov 28 14:00:01 crc kubenswrapper[4747]: I1128 14:00:01.846913 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4" event={"ID":"489400fa-6bf9-4175-8d9e-c744efbb3019","Type":"ContainerDied","Data":"efde510577b3d89e6886cbc0ab196a994382cc50ddc215de55accee319413245"}
Nov 28 14:00:01 crc kubenswrapper[4747]: I1128 14:00:01.887650 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ps5d4" podStartSLOduration=2.080162058 podStartE2EDuration="4.887625869s" podCreationTimestamp="2025-11-28 13:59:57 +0000 UTC" firstStartedPulling="2025-11-28 13:59:58.809367101 +0000 UTC m=+2451.471848841" lastFinishedPulling="2025-11-28 14:00:01.616830922 +0000 UTC m=+2454.279312652" observedRunningTime="2025-11-28 14:00:01.869751913 +0000 UTC m=+2454.532233653" watchObservedRunningTime="2025-11-28 14:00:01.887625869 +0000 UTC m=+2454.550107599"
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.078835 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.157864 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/489400fa-6bf9-4175-8d9e-c744efbb3019-secret-volume\") pod \"489400fa-6bf9-4175-8d9e-c744efbb3019\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") "
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.157958 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/489400fa-6bf9-4175-8d9e-c744efbb3019-config-volume\") pod \"489400fa-6bf9-4175-8d9e-c744efbb3019\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") "
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.158053 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmdnp\" (UniqueName: \"kubernetes.io/projected/489400fa-6bf9-4175-8d9e-c744efbb3019-kube-api-access-gmdnp\") pod \"489400fa-6bf9-4175-8d9e-c744efbb3019\" (UID: \"489400fa-6bf9-4175-8d9e-c744efbb3019\") "
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.158795 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489400fa-6bf9-4175-8d9e-c744efbb3019-config-volume" (OuterVolumeSpecName: "config-volume") pod "489400fa-6bf9-4175-8d9e-c744efbb3019" (UID: "489400fa-6bf9-4175-8d9e-c744efbb3019"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.162949 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489400fa-6bf9-4175-8d9e-c744efbb3019-kube-api-access-gmdnp" (OuterVolumeSpecName: "kube-api-access-gmdnp") pod "489400fa-6bf9-4175-8d9e-c744efbb3019" (UID: "489400fa-6bf9-4175-8d9e-c744efbb3019"). InnerVolumeSpecName "kube-api-access-gmdnp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.163026 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489400fa-6bf9-4175-8d9e-c744efbb3019-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "489400fa-6bf9-4175-8d9e-c744efbb3019" (UID: "489400fa-6bf9-4175-8d9e-c744efbb3019"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.259177 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/489400fa-6bf9-4175-8d9e-c744efbb3019-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.259233 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/489400fa-6bf9-4175-8d9e-c744efbb3019-config-volume\") on node \"crc\" DevicePath \"\""
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.259245 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmdnp\" (UniqueName: \"kubernetes.io/projected/489400fa-6bf9-4175-8d9e-c744efbb3019-kube-api-access-gmdnp\") on node \"crc\" DevicePath \"\""
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.859403 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4" event={"ID":"489400fa-6bf9-4175-8d9e-c744efbb3019","Type":"ContainerDied","Data":"5f04c73009db5562973639932e7661a39a0b2098ba30607fc391b1f0958016f4"}
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.859646 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f04c73009db5562973639932e7661a39a0b2098ba30607fc391b1f0958016f4"
Nov 28 14:00:03 crc kubenswrapper[4747]: I1128 14:00:03.859461 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-9qdv4"
Nov 28 14:00:04 crc kubenswrapper[4747]: I1128 14:00:04.131299 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"]
Nov 28 14:00:04 crc kubenswrapper[4747]: I1128 14:00:04.139622 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-wgstb"]
Nov 28 14:00:05 crc kubenswrapper[4747]: I1128 14:00:05.655950 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76" path="/var/lib/kubelet/pods/7aa0f709-8586-4f1a-8ff5-6d8dc46bcc76/volumes"
Nov 28 14:00:07 crc kubenswrapper[4747]: I1128 14:00:07.890820 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ps5d4"
Nov 28 14:00:07 crc kubenswrapper[4747]: I1128 14:00:07.891306 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ps5d4"
Nov 28 14:00:07 crc kubenswrapper[4747]: I1128 14:00:07.952022 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ps5d4"
Nov 28 14:00:08 crc kubenswrapper[4747]: I1128 14:00:08.942779 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ps5d4"
Nov 28 14:00:08 crc kubenswrapper[4747]: I1128 14:00:08.992071 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps5d4"]
Nov 28 14:00:09 crc kubenswrapper[4747]: I1128 14:00:09.642105 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b"
Nov 28 14:00:09 crc kubenswrapper[4747]: E1128 14:00:09.642400 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83"
Nov 28 14:00:10 crc kubenswrapper[4747]: I1128 14:00:10.904753 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ps5d4" podUID="ceed3fc3-3432-42fa-a724-e1f266279d04" containerName="registry-server" containerID="cri-o://18bc484d7338fbb711f65bf14492e5dcd497e84b49422494a24ce1a7b330350f" gracePeriod=2
Nov 28 14:00:11 crc kubenswrapper[4747]: I1128 14:00:11.912690 4747 generic.go:334] "Generic (PLEG): container finished" podID="ceed3fc3-3432-42fa-a724-e1f266279d04" containerID="18bc484d7338fbb711f65bf14492e5dcd497e84b49422494a24ce1a7b330350f" exitCode=0
Nov 28 14:00:11 crc kubenswrapper[4747]: I1128 14:00:11.912784 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps5d4" event={"ID":"ceed3fc3-3432-42fa-a724-e1f266279d04","Type":"ContainerDied","Data":"18bc484d7338fbb711f65bf14492e5dcd497e84b49422494a24ce1a7b330350f"}
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.526542 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps5d4"
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.584192 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-utilities\") pod \"ceed3fc3-3432-42fa-a724-e1f266279d04\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") "
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.584267 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-catalog-content\") pod \"ceed3fc3-3432-42fa-a724-e1f266279d04\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") "
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.584353 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq7z7\" (UniqueName: \"kubernetes.io/projected/ceed3fc3-3432-42fa-a724-e1f266279d04-kube-api-access-pq7z7\") pod \"ceed3fc3-3432-42fa-a724-e1f266279d04\" (UID: \"ceed3fc3-3432-42fa-a724-e1f266279d04\") "
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.585324 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-utilities" (OuterVolumeSpecName: "utilities") pod "ceed3fc3-3432-42fa-a724-e1f266279d04" (UID: "ceed3fc3-3432-42fa-a724-e1f266279d04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.589775 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceed3fc3-3432-42fa-a724-e1f266279d04-kube-api-access-pq7z7" (OuterVolumeSpecName: "kube-api-access-pq7z7") pod "ceed3fc3-3432-42fa-a724-e1f266279d04" (UID: "ceed3fc3-3432-42fa-a724-e1f266279d04"). InnerVolumeSpecName "kube-api-access-pq7z7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.643931 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceed3fc3-3432-42fa-a724-e1f266279d04" (UID: "ceed3fc3-3432-42fa-a724-e1f266279d04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.686291 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq7z7\" (UniqueName: \"kubernetes.io/projected/ceed3fc3-3432-42fa-a724-e1f266279d04-kube-api-access-pq7z7\") on node \"crc\" DevicePath \"\""
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.686332 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-utilities\") on node \"crc\" DevicePath \"\""
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.686347 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceed3fc3-3432-42fa-a724-e1f266279d04-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.925762 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps5d4" event={"ID":"ceed3fc3-3432-42fa-a724-e1f266279d04","Type":"ContainerDied","Data":"acbd4c9e632fba4880133048dfd2aa001b4b554ad4c8951522fc973dc916d422"}
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.925834 4747 scope.go:117] "RemoveContainer" containerID="18bc484d7338fbb711f65bf14492e5dcd497e84b49422494a24ce1a7b330350f"
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.925832 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps5d4"
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.953208 4747 scope.go:117] "RemoveContainer" containerID="60d4b8bef9afd85f5b380387774843b4536c44658c72d7d09088cb53dc81f63b"
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.963635 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps5d4"]
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.972223 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ps5d4"]
Nov 28 14:00:12 crc kubenswrapper[4747]: I1128 14:00:12.997628 4747 scope.go:117] "RemoveContainer" containerID="9d801e3448c1f9020e8a054c7fd8614cf992208a033953c55d3cac5e8a9c8a82"
Nov 28 14:00:13 crc kubenswrapper[4747]: I1128 14:00:13.654594 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceed3fc3-3432-42fa-a724-e1f266279d04" path="/var/lib/kubelet/pods/ceed3fc3-3432-42fa-a724-e1f266279d04/volumes"
Nov 28 14:00:15 crc kubenswrapper[4747]: I1128 14:00:15.124589 4747 scope.go:117] "RemoveContainer" containerID="9c824ea5fbc84eaf1b3a5b1c2d9780d4645a486006b9788fc1e98ebbece5a8c0"
Nov 28 14:00:21 crc kubenswrapper[4747]: I1128 14:00:21.641708 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b"
Nov 28 14:00:21 crc kubenswrapper[4747]: E1128 14:00:21.642750 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83"
Nov 28 14:00:35 crc kubenswrapper[4747]: I1128 14:00:35.641874 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b"
Nov 28 14:00:35 crc kubenswrapper[4747]: E1128 14:00:35.642451 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zbzpq_openshift-machine-config-operator(bc55136c-24a8-4913-b8b9-afe93e54fd83)\"" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" podUID="bc55136c-24a8-4913-b8b9-afe93e54fd83"
Nov 28 14:00:49 crc kubenswrapper[4747]: I1128 14:00:49.641700 4747 scope.go:117] "RemoveContainer" containerID="8289b1f62b11019a75e253eaa7db6977c9b94db99bf2ced407bf50d99931e67b"
Nov 28 14:00:50 crc kubenswrapper[4747]: I1128 14:00:50.167561 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zbzpq" event={"ID":"bc55136c-24a8-4913-b8b9-afe93e54fd83","Type":"ContainerStarted","Data":"65a12755faa4a2bf0d60265dd300d67c79e15af4efa4f72f8a2eea9200e10dd8"}